[Binary artifact, not a text document: a POSIX ustar tar archive. Recoverable member listing from the archive headers:

var/home/core/zuul-output/
var/home/core/zuul-output/logs/
var/home/core/zuul-output/logs/kubelet.log.gz

The sole file member, kubelet.log.gz, is a gzip-compressed kubelet log captured by a Zuul CI job; its compressed payload is not recoverable as text.]
N"Go(N:fŽBl+J5Z1< 1xweI~-<f~1S"$%u}#7")$=b2++Ɉ ڽ` Tr`-`q)pjhCI{t4[ ǀS9h!"hn=R' rND.3JH>Zs=3Ɇy!t1ZjB۠5O`E |{P­uSM%@rI *ZA &V} :Ua0R$H-obCz.D1%np,k!Kqܓê` aIr /@BA]dVNe'[lKSY1D0 Պpd$l R,ѵSKуK1yk+݀+$;6`VS*v(jp3+7 i) PVy&}=y$lgGz~HRVH D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$" H D!@B$E✐@8!`|@0[N @ +aSKU_ſ_~ ~7>^c.6U/7>Oրku.hA\yւe9$vwT~ПkW'.ɨBc%Ԝ7/ ` wܢJEh䂀0 Bj.uXL9auS+9p ETVe%=J绐V&4+O(bEq?.K[Δ(̨bZKX2 Mrxدexc &PE$rp2R X{)+u•ZhLNhkRL2f&ۤͅ9.TɂZ l?ε݉Ac@H,x9poiàenGȺVZ2A<Xpo\R Ԕ]JRUȠIiw3osDވ籛|ܐm;z%!E={]T.P$T1$0_ɤzjbeLBKWd=`7&~ #                      _ ?'S`blRY*?o^ֿf{U \4GbNdu(3 RWr&O Ͻ%pT// qZ"%U$dI@Z);#4L2l] ׸ P kN1G[..am0uNws0r@ւe8`!Cݓm[j.}0Z1!D" ++EGK#k @VZV1 mDVXŊohkjMYy7N}pzSbi#2}?L@fb >BY6_ /?}';6_WE~\/!Z{k =q>s UARFl|r#e= 5>;^ 65?qԝ](=ܔnw<%Ϋ1³ގE>h?-+_sˏ/5j^?tŤ{qNq1ﳡvi.uE6&G/9zML-g`EhۙjF842Y%]bRjs(k!|2ŗt dXbt&gBvg]ugM@=k7fv *\~ ]:D$H:l~CPĝYO9lNm,nu CR[˿˵K'hIsU|vN:2WKj ͊uJ ZXS;>Q2[J6}o{jCVZecgEZjψNٳ!υVS'Ia!),YGw}|9 OdPU8HQHdMz+}I0sakѩW{ˇM>g \)H&B>Ǥ̌3/;!'֕[e>dvkV}Y%f~㐇l&]?`[o!yj:ٽf̚\힡g𡇷fV8\ΩcnmuԏvRjEwyz7޽{xXxϠ{Z\f@d&kyyೄ們W-d}'g>D!>auOgz騭CzAB4?P"${^`unnߜԮgaFJO0ݡCMܜȒ=qG9ړk}_*173 H`d& VQ+}>& `Ge yz춃&n1qkW..4tNӧ;Xyr''ǩӁibV b_){Y2H.V2T PV7v[wOO{tM[SR*!POC }.9Y T3V)]b0ɣuz^t |V-p{rx4)4"oU6y%q?7CoH7) p=y ;+h~h A?n0Itq|ڜh-}^R,[ólaVdY7$ (An3}'o߭f-gT^<:|wǙk8c̎1ǘ,Cxό133ŵc;|S15ګ ST}zJm6\$[ҶmȸނN̋EtR-x'WP'1ـMyP]ъN!L wUOB~ {T]2ǽƴ9~CSoxpuxj.QQKmEQW b"3#I2Ӛn4W&Y&y{:73z!LAC;Č #x1zB2ɚ̵C!UСݾ,{ 8Ҋh`uHttoC6i B6!E:LB%^.Ĺ(X.!!k6DeÚ&je:;$^$#EPY@ ]Zs#b1"fJ'`RSqvo M`fF{u͍F_*^ Πwdhm͆s(J/`IXeRAVUrtz Lq{>E}U$puͯG˿ןeǃs'"8<2x['s !#"DaX ~.!l'7ݯ?αw7P |(?)]W'P2/zߦE}o}a84\\6kܤ}fl?V;iX풠Vl5ZTi&JZzX]R<59c36Jl*O ٸI]<4-|\_zNt E\YiFmAmd2lYhaBA3e.VblIR߰KޤmqpEA\^3f+uP2<l2Q/s[2.iՓ#gi kPXeN:aU:+1 XeD gCqZˌάې?XMYB ˘.(ɉkVMRҷ1; ^dYeH&ۜ"1H$@?TF" G-H-8RH2RsxZӸ * \#"pVt6WHY]gp 9V>eH'ٿ\%zbǴq![F 1d#e1I:MeNN'w;4]Bj7\kUw7G9aRU3?d})Y_rGu5g\2qtɎάt&/G)#j~n4:)'#.ҪTĬÆ:ƖӅ/tZRgp8-ܼ_nFxzCT?eH}E_rxzqyzr*b mḲE4wL#M@_?uEzt^èdBO47UW7d 3Bd.0g糕zeWDhմ9pZ14nI#]4 #qqURyG< 8Vbb'c/MNͣ.iԦg rI C/q*&^VFx?{{SC4^_9鏿;͏o*~8=w\7HV`J i R=y o?5 +6si)륕}? 
dfJ(Z/0HϿB'mXth\̝_[ -fhS`q5kqIj,H3\P@ 4*F)]S#IH& ARCFLhh:8YN)8)B:;𚕺3۰.U>_p6< m#*$1rPreXR E h-Hjci%T!ۘPGCk4qv3Dkg η$~𛸪_OeҼhiYYz@AVWc~,;M/_xU~5KO*~+y&2-E'Mk :"o$,' 6]?>mv'R5%G|Q0I3+"&5Jtk]':%"<'I:tܷHR0b%![ǡKo4_;SGtg\?/t:sc-Nz~]'3M\,6jq7Jߙ^=zo~7+x5 @2=2{ͨeQc Yc˩^Gő:WjZOǁLRj%3goDJEC.Y|13[K3?spj1]Vkfh+ Q V>; B,q#BȔ#FvOm2Z8%Y?GPJ]y!@9Q)FOm볭o}zHۊ#K?CFj_;|1)XʆE)>,1d %Zd$.rP@)Q$7AigIʡ739wC7gɀٕj^>q9 ĜB)y^bY`9H+0q"dN(e"+q%mRY;\vQe|t 2ا967jis a9%ǘHLyf@ icԘ%i(i[cnu,XM\Po!MKʠd@넘exLԱ=ƨlgXVkP+MIXf)}0&x^b05 Y(ɀ PKiAof*ɠ> Ӷ{Dhn%+4ԡ F"e>2-g F΁\!@ 9 NL"#s(Ƞ(dJY !]ױG{cU]~9hBe0ߣj2wNJģg3ГQd,07GI%& b"2H Rx qUh{@: Oe;mh]G wܣv1WnH>wO1|ȭIV6* w:"51 pjޘf ˕ӑMRWgtFCW *|uU)Cnj1١TM^n]V%Th-= 8f(kγ(2n ܟ9I k-,LLJd8`\杣o$l?ڷ2%xO;a\V {#v͞^[o`897dJՓI Vmd ӫS"xAW<P0Rxgc-A?!qqR`yBHLvRgd;/w+^n }Hh IUjX>7^ ._?g jFkb^tN*fnt]':/FʜO_[>hmI-.#!8olqQ-L I-_ IQ!)i(3zWuUɥ~RUNKP-z!]h~v箪Vje$Paa"HYD@-MBtDDd4D 'fxH'&LGphjS2 @M$*\"e5pl3.ᷫ|>Lî>3;t<b.'gsj_ P8K (& VJhX5+PFK}/lFV)8X*7YRe.3`ep!׳ $p4AqDiCkଦ{@ǣp"-}u,%vEgYnɲU"X7͋2K !޼(Cbk^z 7/ԅtÆ-8RK.DY}䀆r`-;*[srQCh!* !8$ ՚ҵk$u>Cg]T Ks!NO4ǥhvo[ଶ{g/^Z`T[\phMR')RhIx+ JjFmFݠt;b.ϮT=(3y`D!$!}S+(M\|CB9cM2X# O"ER ytΡ6 /ftr.Y%vʕ[P>*(W.XPhG!ĘC%fY9_z@ o{X2h/*  s Z{ыUX [^Hm-=w:-`y8DL;I-XjWDRr})U%ߩ\ U4&"?[|}( ]Ol@PCDg?ڹZ )׻¶Ӵiineeo?ʁ)2YRJ 1NNt' vA`.ěh*,xQْ2g9K@0HYkK]=]֣O9t>fu륻YM#KYNuim|% #C>$6hQZGim6@-텃zYYb]v-T t/ԩy_}r,(h-HU KH)4 Z^Zs,xJ,>Q"BOO3CU";󇐠Fhr*-8"/m2(6$ }=8zٸ9{U);{-/ƥj?3.w`xYG7$l& @n~XM~_ƨ5Ne OtVX}|th9ҿ;ʼ2;s{y˲(LXQ<3Q,i囂ו^m,JW ڶ2XȖm<.t)ƴFvrwqEHn\ :m"N.K%ll Lb{Ee[.ij'!> 4M4lZHOcz;1gk4a,Z4Ӯ 3mh'Xks3zMGn(/JwoK|.w|TT1㐲)sN(DUd Jz1ƳepKL y\^Jqpcߋ+6<[Bνzݥo)՞I"f#gxRk`FJ* L!:0fk|MD_l\p U54ANJgݏW+fu<@'Dk Fxb)*@D(nu"kb9RwSt043KjNL"eD 662)Ot rK$X@"hxD <'^GC]]Bz3+R_OC]b_+?g*.\FE~#ogMpRiHRwaAl;DE^?D9q ҍNըdLOō'.ňV;.'g Nj`Q90[oӳzg'Q1~Ty`}4$WΛaڸ"|D0V w0bAj5{ɲ3UgٴjӳR*|q,BoQS#[(bXG9I0 qzޠ]_~??;Ǐ8y e?O>~x/X@ >l,9oOϗ qY P57E[Eq&AD~>p^P90Ҹ-Oy3w6#B47[ҺͧJ#7cuyXar f<#BJj@O}=$b{ӾXj%:D7r#e/39.|̽m0hQ3de2hYE4ކw)+]8z^ڃG`{\JZ+T($N&)1B\ $uZ V>U[8|@}T뱡:(7/z蜿t^/m'IK<iz$oI5cħo|NwKxS}NOC޾|p㏗ ~i+q 
3?~4h톱a(%uL.*Y^VtV ̓JtysT'"n$E@.UGӡ7p|7s㽹l-PsxÔ]?QK}JtG;V5g ^n12h0.PcQ5EBxqیn8P 6WhA #u*`=K,o=r&CbϝP"xvojgOK)PB 袟*21`$VKQc`px|Oc="o]zpfa +'-x}(y ;Ǎ4ᣱ&A omTD\O5;sMF r}4O6?Ze?VCéPn>(&8.+_G|?{izcy0:h2<흏?1ޘN1_: 3G˱~Q =b2 P # 0lXxV I`]3_ƒ$blvr_UV]gWJ:v ٕ1G*ް ®ZNv]e(իdWFkI`.:ٹT΋)*ܽx yeG^;ߏ+FI=4nןsZN?R+Olւ ʽ fhγ BcΦ횯if[gWXBelaJv0(-velǮ MYË~="gSFzp3$7b IHJkX(#; !_5r-F0mr$ 21Ȃ0YjirYby2V;oE< ^ZAZuz/n: B!`PĨkLc$WON2M(%qĝTj>J RBL2n2xk-TflbJ'b DP|f/9 !zd*RrмB*:m@ZSPr*epKy}j[ȗ0ER@@r $P9N#pgkT*ǀH_92j :HLJSqR"XAeLja0 饈"bABp4ʞP*{BeO ='T~܄J̄!{N );%xSwN Y:%x˿SwJN );%xSwJN );%xSwJN );%xK{_R`YJ?0XS`e\peeSc"a"\X騞 \ܑ^䋶NRyU:'{YS€/$4@ND%&JHJ9Iظ)NY'=TH۸O pjۤ>mr8ѱ~Fy]>|-r7&y. Ʊ=##]h*Kmd@~N8il!CA$*$mCMɨ;Js+kH.g=R6` `cLtdb.Eښ V{Nܫ^YEr}zPlM NJd~~Z)nV ?:-#-,|z{zV[^\6 svn?7u`A!Dw\\g*e:e,%A{]u=&tʂQG/=ts11$( C56~ikچ}Xn_b cIF1TWl~=BzGZÄWz8RfkioKkJ-mmi#ءϞh 򀕀wC0[+{! f(M' fƏ3F*,Ԑ2D`IZeE:ʓCW#X~ ZTNR֢صT!fciZ)JjKP\#U@]%#aBx"tњhHIujl-=iפi=]-ДƩ<~jGNRQQhU`&߈Qf#"%cϤe!$K;sp>w9;hϕQEM*(oB%@>J(IB %f_h*gAҕ^B&P(XhHI[U.H rl8[7Y|PJb* dC.Ve!)B Y}7:Nvh)ezzߓUZ&ˇ2$f1`/Jd_JRAty'=; %SqAZ*cbBd2lX]q4RUlM|`oc-R$K֠F H8F}i  GDxlί~ѓCfKg -f(T1cKV2"b]*xbFR!bvq Ɔ49hB^*X:g ^M3Toq#n2L &3c $j"!)7΋]{)O^} RXI`(!j`/b} :=~1JVJeEQ`+2ZѢ*i :HgqTTg%gguj6PDd(%*ȐZ4fd:q<:Sa^Lq~4r]}1-&*Yn?n'_Uf,zİ޻}I*J z!DBk!QYz,挥 oᝣBi33"dҕ[UCxnݿُ6;H Q/;94b\EaفIa|C+dIBp@a+*Fs/C?rA)hPRNHg R^i[H *{x p= }Hh(51b dko./e34kX8e=G{)C̯!}f3wv\ρl~AA[~JaA`^;'vOoZ%i|dG'/mRA00ֲ&G6,T%ֹ_͆YY@ǣeYu{Wx+|GƯWfEM涃\;(}ip*Zu N~aO*>Ł[z0k{ɆNolY;-jݾuƛ;oR롆W~Y]:Gд^S{}4hmwmrs$oK2M%ϭ{AP;Clj fELhiu$y2Ba h=uiNr3^">ZE_Ql;"_,@Vu^nCyn=K Ͱ ]?olg*VD V|8;AؿR8N .뼐4Yga 5PR.< @2+!J`(>-o+_J2,7NihstN`jNl6-Ns'? 
e\uBʸeӂ2p""ϏCYjͩS4kLlpdAEV+ki zV>z"TX'4Ռ\bhEKŪ&,>(c)&8 A) Tkl85cCtةsyU_ -ԓI<=C8;}?;\~-ׄ u1<΄\#OQiU "8։Zc1 HH[1iz#V,"ZѲ锆݌ɱ0e )L/8<^ZHأ_.cH<.O#^G7PGsOWЦo\[I1_$I-8X{O6Y~֤7<:$GUqƣ3ߐΆIbb 4U4:_ /MRvFrzbgdLrn}Qŗ)|d3d |xC%]AzT䡀gW9ַ|=a>oQ&;:=Tb#/x\\Nw'$ f ϺK :޳`uxc uxKz[TSiP2ai<ldm[*FsƣI=wM2V-vG4egte?k&f«07[ň'],GxHLE ڊmθMNx=\lyy< 7[-xi uD tK|6xߨP> oIO |C&\$2A``ɖ* AlI`ˡ)Ϟ]bB*š4[-l:x]u%Bp ;TƋiY3^*EXn)]GrI3jV]!OŚۮ!2Nk`ο D޺D2kEd:HBTV"{DN"8~kFGĀ֖} R;ZE6ep( Aś Fϒ㢈EdY")!LT1|d_B°PŊE+ ֱJ,!I(՘$"e ] xE9 ۍtX|\M09O]IƜW e̬G)T`"-*kk%HvDEecF )I2v@2]YoI+NlJyg~u;3vzyZH%7xxɤHj,VEfETJC~oz ۄJpK0 &(4p /Qz4 Br5ؗ(5F{ 4}S RW%A_N_B./E]j٫BշPhG_A@Û0S`c"?~N: o aיCUm ZjeDH $%NFPo T/돟_fH>!L|QMً Ra͝Zwh=^mRbMF i֗ Iq+z)P77f egُFK4W+eXc|&Q')_q-"M+Sz7Q $Uh!E?nEoL8z3w´yc ۊzo{O_V/wO~Be/m/앐DoעM(U0aFp]\rM'6#-/$Vȵ 1eqHPEľA$)SIب=ek8ozpoIu|gvey >|}o vEhÚD?|O3eF?͌GW=xC$Qo+.+@7S((ЯKsS-ݲw ٍzy< C7qATsCWCJl{rg?5٭_0y,/1( Y=c,2VJڱl5'!U'\$0BkͻصKgILQPlF:r <H"dOnʟ臂1R*tRNIZƌWK _pSjbJ9sʇ0U!о\g|CJ.igioTWGQkX@_1)W<3!&_EIȑL;IAB2u[Mr{~e%mdS5RAgh+s`:\!XdUkK2rb P%%۠!3d@kuVTj䴔F~bhuQK#3ĕ灖(q2zRv[%вDS m}7:u"PSn_ u =}j9iM4*R³٧vk-,͜FsjUZaOzC:De.gcƕ~&q@,9*&bAd I)J:T=;FktٷvfF2!ӠP H8D,:Y VumO mƉkxГsj,§B􇕲6Okg,nO?@#㾒c?ulGx)fJHg5.fRZOlxƣ$*!MhyX96zb}_'vJh͚;kux7%vVg wbgQ,x;8Fb$pN X/vsP RIQ?^g)QV9;uK43Mm{qӦ.&,SWืdfv-٢~f<9`ɀp%4l K^=yVuʨ=BݾXdY\w5Q#}R]'`6VTOih]a"ww2Pb6OjF>ѣ-* b™g ?>yK֢- mۿW?mhPTߦqjsiܢ}d 죰(}\Ǝ>;P`y^c0_L 5j=E4DϾ k"UTޖ -*)d.,rBҖ:BnV! 
̄B yȫ[N)y\N87Cmz|.qYe݋,H_>(2ll@Ha+~ 85*8*M$#ؖtӿc_-[[]_޺zx2}tP+[KTm: Th!.9iH dtM$ļs@`M9OFrWFM!dȠU+'QU#gwctF5H:2 ,y^㱒 j+]Kݼ L>]wnrS]I7i<!)i);:#_p*ˡCI$4}PntbyQzz^\c>Fz9KvqE# :~8JrҔ yA2ZՈt%m1LPEgi%#r6MyR0뒲]Wwߡ1j}4JfޡM׾7}ś~v6 ~  ܌rxH3Ywh4nZ;bq^S؎L"ǂd,ߋ&v=9G6d镬xm[Ex;w~@hZmB\%U\r2XM8 QOЌKCpzzzWpK_1r9b+h̤[+CXUWɅP.R+_n%Yf- FW~U#gw:מud9n#%h%6zXZW=CWjAm;c4="9 89N/kd#Oਗ਼YC^39nX~᧋츍 (2\$3\謏4E I4E-Wօf[u,znhVcO Mtlr7Cl_+:zg#0}OfQUҡѓQ- V3-'/dZsV-NY7x`C.~Jnǿ<ꔕA P D&'(7dU<.E*B4k#`J;gmGG !+{Pa}BIFKn,.SόgJN2jAFnu"j*fݯ{qeBv8mK<3W('U-rwQϨ5 ơhW2;^hHTޤh# "M*FȝV޹Jri6|_dGSJ}vc+ck7q0, oIW50ŘTSByYDq%J%Q΋9{A8f͸J3ns%b2Z;Ω ,e41C-"$:Crrt9a*)b }NhIU֖ )!F^ӑL_Ћ.~bbB7RV7.\;]>. D_+OI "?&3C^+g?~Z TbfoeP쩁,˞訉!:JUgV\r/R@Heʂɡgk\gƭN@$@LmeYs6̻@F/|oD<]v/oÎ!iz5f%Wx|/I]B&@sjdZ@j#PFE|y cу U%8XN1mc.dUpl CN1Y*hPrǔU#gY@K/üi|+_{S4,OYrl 4iL79%Yj0"2zO<Ւ(y=V].Y:m9 h%4AY20CDԫ+.RN4h#AGA∨ $)S' S::XCҲT2ǔbSĺGP.EjZt%a:|YQg&a"ŶBgJ!tٱ]@dH)csukE lhP1$9B<5) ң^tL\Y`ѣP-lZ < &mL, JlO}DS*3K18o Hc!)6+ ݄P̂cAQR˂ I{$%BƏO˖貊8oN:/*1 t2vN2VeC):E9wBXEْDOkSj5ײ%dt+]~M;tE{ng|yW}׷qCe[ĆgQL 7d`>LMPH4)KցDɔ4]wۮ+#82+iCǔ ޔytcY AT}IɥU@)E/Lx1Zȑ*6+ǃ߇W#;n~ % כܷ0&]]+dqj:< 642R gifvǭVisO2οkMV&pmI"_vۑ!xaw1~J)R&)>$q5H{gWWel{׹0;i?\[غ\n]Twtf{XhC[ز}{U;ӊyv|7[[ Σf%ʍY}@;Wmm[ܔK\,Ѵ;!S]y9/r_"s\䨸CyocMs/d/TF0g܆Je򀨲 ϊs'.nG PT - b;+8WO.l-]ׁC8!`l!.2g糅Ձ9Y #]5;'84pv&/ߛd7[E_H%%w:|ՎD4Wꐤ6F&*%f*@*2Q0P -kg#wzb#any< D6=x;!*&4;KmGUXSx'IƃyK(9QMbxw&:Pq'tBrx@H2d=HƱ`c1(T Iu;NKI)zZ{>A 9-Գ3_g AaGK1ux8ìؐ-OqWY6pJBF %Ѕs@qJK$9 KRiHģ T3<)D “Xrx zi 1Q NsEʃ,eBryQEE5= ^y5{J)}RQSh0l܈~lȶ>I1\Q RWXC^v6Ų.Gʧ rdBH u"&"O5֠78{jO 4y>/(%$gai:1XYoAA//ҶAW)n{v/\j]V){u~olz,0Ѽg ;DYe=tstCuhC:&hy0CMК3ę!8"YZϺ|r@;.P-Nsw{+I [<|ٸyF.g=Eo3)YOCS '̗#;uO tu„3"_1M|`j@鶧WN>JԣԮ{x2'i4]73\xv u"UQ^-,3\bŞP.REnqf=tWMMb̺ឧTG /B*[eg&#޺1Ф΋Ǒ2ܢGNa'хtK_J_5gh6xݨ7|QRԬ8x;2t` 3DaEL8!ri Xlʬ& Qh 2Nnt&|__Jpسt+6<:+Bw4|U~u3en~']|8=.<5ރ-6(CT{Llg z]`w^Oҹ]eջ8Zzh<.$s5˓ڨ qv pG<1QF+PD8t*m51RHB#w`p,8%JI4S.hCS0$EΦ26xTΆ5]lbKmel`hؖ[=k<|]IkHB06u&0)K'g1A>bM62 βc!.($,}>5$ 2:'t0<}5ߩBl8^/W?:wǯ?wWݛ/yYi UJ'M_'|R%o.A/ן3~']+nZxI`y?"eg8,a5);s 
?8bhкUhb\I\rø;nN_{1%iJTB!l¯'1V,!" +WX1rOR2c7 "fDQs8Y4ֆw)J-WŤvQ!2$ E $%hNBE,JHwt9xpbxuG}Tu;*յskrcҝE rU2KH:_ZlJ1xg3Z lR*^(uQ7;hIsEbP-E0aeBtCV넌vliGt`8.`yCHDz"\CL$]Ԃ_>Av:Q~ܣ/4I- &b_uw7Ui?8 (F.XJ'ogOBsݻzW{xq o2"\D5Qڨi1ZO^z!IRBl=..;./'DORD.D$+Gocx:lP ;\+/,笰ik~[abnCM{BaɳJ?|z'[I˴@bE;HW]! wyԯ?J;7q|۠ I)q#~&̮s!xpцu ܞ_4Dj)f.mq=x6x&[ g>?~}t2PЂ5̝Qyb8onMnЫg JxASGS+OWc)g'T%^V|8Hm8ɧ0.V)'J.Q5NƱC+q t6ŚPQ]VH^HS {}oaܬ]W+,ÁRCf'o~ \5ѡ{BVΟaoT )柄cj)M"UR<B?igeϋΐ.HKHE*'pҙ"jw^0#Ό 螙~13yDXG9e,+c8KAY})ʘ ^Z=rmGz-aW^] Ҟ]Um{vI=rʼoPy2~~g5r>{ YV8֔Wo~so+LSCN?jV 6F!CWj-{Mu״noCߤ hNQ/Z b`.F#K\tZr'*MO2O|!뫝r3Wz݊x%K{NM *T0| r |3X7<*d3&G@MyKZژd CF#9(G~ߑ'@zUPqf;* TiV;?AVN? q (ؽ q;u xPT62H=vBu,@֫)3$4Xo|Ojf[#[F6ȷm-Eym7q\ YeeȩQFrv>@:D&[0 4"^0WVa')Ns^K&)QZ)Q ꔄPA J59xNeD.%eK빱1$)ۓU U!gSG IJ59&'$*%EBUR!.  uٿ B2R7T& K] Ho"u <-՚1HT1w6)xt2g{ȑWbm0 ng0b >YXr2}+eܖ(K;@M+֋UM-:qؠS,YJFO*-լӆ!+%^WDDJr@30tՔݯ%HWD+P z*pmbikdj̓FG`(QIWV$(:AI kU.*stHII.'(Jt--HÂ8h7GZeT"2s\=oH&*:$.QCˇ^IkPTW3`"HAa~$mBkO|""D"*B&&v.FB]-tc$QJMr!WsPIxx0s2.HVWZcL[ (Ds(м%jsPb$ b B#X$v́J5:śJOWLn-msu1&/ZBZ<| PW_Bɮ+7YWϬfIZ Wu7PODid,JTRhB g.dHCv# t@$DT e–9\s pE}Oo^P-[*PHV'S.yC1OijTl:-wSu elد<յ]f%.[jshlnrRָ{'qjQ1tYu4oHYk<'Xa [[W֛tfs7f!e5nZ{=6~Y=/L9궹vwnCajg~M9}gA+yBvһھ4?oАQem2hڨM?E;1K[H7L1 +l.ս8ly*nYje =HXsRI; +0^0..#֑q했w \|+H~-z'/ s^fΜڰsWrjd $:>?OmA ͍J:ԖP`" Md62HBS UcQ-#VypYS uv4lߌ,ş6`06'/cIlx8ì-gq9WYJDFsblbER $)[$@XJD]K@)p$(R/8:E$5(Ε@=ec/cPB(ZbG-X@#@ q*T(@ig=`p'6LO?6LoSTDw=ƘRebq oN U%,!T' ,XkAk1Gwnȉ / j0:? q%u {C~j9(9@oy})/nﲷ:#W׳3%9|K"2#EE㵽OMr)/c/p(qS|U34W:F9,|nqwt[[ -la峨6k VsӸಖ;dAff!WV'Ƅ5-ݘHUb[CzԖZZl'8O~v<.cz~xcrVL.j%uMB+bu+:~8 3F*)̶e.mo:'f +1MsF%E SPX`ρ.0!$ar}y,ǿsOQVl:z^u1+Bz݅(ԞkyP9g|_i]CQE+CK**P}ʥm Vk )X|) z4<[ QI`VRtrQxNH iԁ'&$8IV'MQǥAVS@# '763sR!0fGmiDQX/6 :?r~Yq}T{gs A\ҦsF&'j9^4&XTnD)Mʋ2uV< " N$c \U(8(.3P+G1&($s4ֶ?8Xې)UL$@Dї"uG+I~/m%<s_C ?ڟjkjNSn)佒 HH &O, Lv ; Δ+=$]?5>m k )z$\~?˅N&r+H"y~llTm?8q">VTl 5 daPN%99׊sÉnNd'0RqFrkkYL7Bç*;.[MFŦŴ\?4,+d$ Gß'uwp4$GrHMðaիL*? 
U̻h8/vEL1yK6Hu 0GHX"utrzꔞhy;]*L;{Oç}>(3O;Οp&ȱ 4Mk YX~ \/-{?]0~j@vErSԕy+@FGpHZUqo^qYv2o~jE(547fh]gdb\IJrø]=7\Ě1;b >,.BT h =q/6GweDžf\Ibƌv B*%s1ywhaԌ(jN\2Ԭ"kNIzjkÜT5|-WŤ $ 9!IJp, IsqWgFVp1o '`Fi xJ<"ER儌6Iy0?sw8;F )$CMrVmpo TER )KMHQ9׌3e EBjLi’`!5n*$R8qY9BnΗ I`Os D 뽄"]``j nhA$1|oAhR}} ~cHДhS ,؉ZD0tzR'D"kQTYKR])H8Y+#H% <)cpcv8b4B⏚Ee 6- x a)$d#H8RĹ 2 VTiM[Dpn9r5'=>#m' o0 pͱ%) hV^S^O})HȹQD*/_dgA5}3M޾w ~N6}?|1[vxwTn\(E=3E8,8;YȉoޜMe7 cx>FpKjC]yԢkȋe0 ޻=[Afz?O/O/ig; 5ۻo3x|p޻M )]Ok]5,{7T/F_9ML07PØ\O/UU!muJLPD *gvA0tJ8HX\fWz'4(wmmJyřΪ4spyX}^Mɑ'[Ŗ/ئnb'Uf*CHA3)'1Hs溔-iʱ| ͭ3 9n_|x˟XErs/gDWö744Gz7qem΍i\ҬS\E h: ;c~zWpKp~`~`~`=7eʗJ΁A@pbGNsW0G&z5Rj${K&lA=GYxo鯰ƒt?X )X̔WܷHi?rB|>/  " LPkPEcoѼ<(Q-(3UuOmeRL{>F w0tJ9)Ex;hIIFIaHo>F4 .슴?ZL3|W\=M`'6WOU 'I+&虹O0WzΤ'QX IY&?种äܼȎ`YTNC3Iy>2rYld9t0y@GWx<1ϓB+KB@{}7!x PL]O?ӋAXį4?.u..Í;zkcļ^Mn#z |]:!}\p'ssI|dTѓ1,Q]^-U^D 7D*_D 0 | z'8 PF p qWCEc2z9p(R*݀{xzOom<646=k<6xzOom~Rm<6ަm6h<6OdZU6V6xzOomh,eCI_6F|و/e#|P$F|p]#lė_6F|NZ\ckLrI15&$ט\C'A^F5ځ;/W_|Vlz@u3a_ ul%6!-w6:dɲa9AD6=!@!k+>>!9nU&Ĩl,(Rh$sV))S܅!jadgӫ&&WrĨW>f߻Ǔ|??G#s4Gr5v? ?Q\X Y=8y٘.ѹTN5BfW~ %( A'JYr19l-J%LpJ$2qeg5qByrVB<|=6Bw+jعm6K^^ ߮^oO8 K#44iMeL B!|}U1cCU uD\( up0T`/t0ґRjO&~ ~g=My[Q{կ= r_<yэB 4i֭:z%MN;<32 ‰]{5ߥzwF/"&h.&H 1Dryu-dʉ%1IZ,? 
D]] :Y`VR]4O ih8=#[pKQ@8dC(\o>˨7\DTr$Xr,lXPR?F>9JޔV`*W7>H B[KMB($yAytKIh=)5YT+C4?R *1 MYbz33'1yy TEi,M!I_1CB6ڑYX̄T겐M^xPCĶ@e)#Σޫ@zzIJl(IQe!^֢̱$Q>,z0xj:#P 4w ]{}4&V pFQG;ZG|BC< s _:iiYAt":)CӑrR嘼w'ͧ_ǻs VZ1'2@E;x<]e퍅)ׇmCZ'BZZU81M쏟*) tNkLZY@VI추ݾF@p^Rd"8MHEҜ RHuR"U{)xRK_bRbD̑I*G0X?⽧j[淐;]_LG۠R #[%/W_[R:92}튰{Å> ~\-{^‡w=\n9rhRs.\B.~$ `UߩuG_Zwֻtzfw xwCjQn5[v>W%.yPn;N~̇$~Ӱ螎WIdiwl L㯛Ԋk&M[k$M/"Gzsk|s@iđ}s./eśywQm`1ŎPmTNCr:#NI`uI~(۝~Z&,i}t}R߿~ئŽĭ 6t6-?L!6y^*d B I(+*T ۲}4iZӴcjڠ5JQ1Ws?JOJڕҒN$VBWOV^M6Rox%/TQ⒓MNi Lk5G zw%y.EnK]0bU*_كCPQ(gv9Mˏ~K&zs|줣*<` g< a6\}hA7|ӏ\m:R/efÖ5izK 1*2<M rJI0 ϭZNqD ) |S BXZyWsa祻{[6tD vM1bN1yMC+VSWV߱N,݅i˔>t씥tPOZRk/BVZmkظӫF5 7K">[2m"~Jw+z10s,jSnʅ XjxdVw-tPAn:60]XRi*cVDZ>׺'ϝPic.B@' }=}FnѶ_wʛBQp<8}z"&IDcP^d|`Gͅs a̚dtF{XB&o>]~˱jscK¦۪^JjT{@T/fUH`,;oC˦CBCL LD_C6Y5}ŹkA&qqRr(;Oad[[< K#I-m.Yk=heR`hB UiDlioy؁f`Z$roCK9e+59m1 2Ut 979E!hՋɽo }l$3aMj 6[ʩ t@D;HhUf ឪ E5)OT{()&8㒉Swd\=?I\M$OpſѐY9@|q!zFfƖ Z~<ãòp8-sn>,_Gx{C\?EbD׋͟Wˡ Uږ컽ɇhӯML=M@_NlHOtKUf&[ӽNj[~yWxs6=_x ̮c-9_<VV˕]qgӺc?\.cM#摔yaD0Y*G|,Xh8X,rIzyr>bM6͕ Q'`Rִ}:h Qaxo4;y#?Оav/Oi;?~,?߿9O;|C.=| i? LIbak"A{ܰrak3^Z)?~>f~Ob9Xg(R/00< pRP`ѽqYf-Qi&C6g=YSnW%'ƓTAd)GB} `@Or-'ŭ:!݃wn$e`*Ę%pct튆V2d9ERt5+ug$=a!]}a㪄P W(2,*]ʣ ^mZ ԐCwʡnS'Z=o#jm{'ʍKW&]nU]/Q} he1g[>^q4N y!v_z«Wׯ~+Yӥd4F=8kiG~ZO?܇Z/"8~T8%)3Lee^HRZ ^{:,՟11tҡ EC}- M:zNgglC-ukc-r^_foUwo_̋/jq7+Lߙ=zo7K5:@2=2{Wͨe^vc%y8R /oW/pzB?F"ongK2Jtʞ1* ,UjUE5:*hqX[mJYB(ѓ,SL9EGB2y[ % UPbId5K{ 6?ޥ&AG^9Nzq؛ xL! mX ۍ /erC-x$CY:5 8Dz-\,H6uP5N+S.p0[_I.A ztBQf@+ NK'^Hd3XEi PV 7`Ԋ'H 8B\J΀f*J. Ӷ{Dh_ҋ Ee>2-fH$\SNmdH"GR)A#Pɔ !]أ=X*N/v;Z!ri;=g]?>RY6Zx)5((.䉂MwW/TBFbѴ]iFr<@i_Dh$/Al<@G̶ uf 2.gE& b"2 @fXP)bP\)rUh`OL}TKR/|[H]]t3uc';#rn5Q9vБ5BCG&&.QK,D]u>PH` 9A%*CT"X%ƍ|ZsHyh0J`Rq%x:gKOPqs`eN]=Sw H~jb:i;wHc}+I d]I/*@wPhpaimh]p\P^Eg* +Yb~YPuu__Ѽ_H'PXBLPn1BxkK dݪKjͅG<&zvYqU6J(X;Y`鯰um7{l_W.yQiҭ- #Zpxů9N;gZ>Pf5^5|9o@HB c.VJXy)E_d+ZV+@[.d`xvQ* f7 dR$R. 
n˞ U8򐸎.r@g2B]H$z"ddn F3CPCIĕ\._})Gy&{=6n4Ѳ!S2v8Hm%:VALW*f[{)7^_?@,f$ JUW3a7t:u Y{ڐ$fTF,Y6,'zAdr ΅t뀾]*KSR3K< KJ>mZaRdm;Z#nF2q<ڭ[ oF;% qLvHK<+|D|pwΜ霳t rBi"YWqdHVwN rǼ00lXՏamu౼k(v-|7Odp+Mڲop2s—̷AiMZblɀ)q ~đ1%b]rKqW*v ;.k 5 (6ge[!A20s'(dy麨V2HBWkz='-Hyl5=uiGoÏnnӶtb>ksy6] V=.sZwQ u].sJJ#]ǴseɓVXF@#+m kʋi?ޙ =% jFЦ5ʘ@"*<9qJ٥:q :΍sP2z^h'ʑ]^5rq h4Me-uή[] 6EŮaO~wEGԒӨ RtRU`FUZ"!(wݒSݥvR]KΆ~)ivgDi!D&_hBbF!9!8kedh@AJv$#xgYiKDuA9^N4-r$x@-rqp6)iħA?^]?^pQɘVJ*{dX YTR=F6W9IWhZioBB$$YQt H$YXRm״o &!H))ahh ⛰6$?1E, !!,B`_١tZ&0d;43Ke3rn1–jtL }v:qw2nQ5_\ ccc6ja0,'‹.ܕ/b! f6(撉cc;cc%OC~Okߵ;F'Mhoqu'rj,<ѐ'xcE;՗DjV((BRJyg*2sU|NQ(eU{}#-+}8P:t徿wö{5=Y^QkYוmiYdUNt{O5|jNlԠVJ]cb4Rt&.dwWCv "}fXkT[sI,I'5x륌zd=Y! ŗW ,%g39qQ }9f%Q *a_;n4Y%k7x2K$d'U!NwâjGϵIF^fyC6ɜ39bf&x)Bdcɹ-vdYv˺b)n-bUbjZZ ?UA^ȕMrʙY^:LN Rc[Wϭ˺q:GG:YB:ճYu-KlY6ܺ~^ϷYF -WCmoj?(~DWѥF+ô0Gs֜ tݣPM6?2hڨEiMҡ^zlsc,Q{]MF76:> M'nPpu8E~6>-Ť'9dP"\sJ+FX.urv s1<,M2啈&c6$:sW<.xCڝqǖRؙ޶~d/& Q.x#r6* 4$(:|y# hv =gQz[*ul(!J 4@ Ns) ʣ,e @Qf;ME`>.0r|#GZ}"Pty?O7JZ+=MT)}DL`Mє25R$Sա2T ӗ2yL$gt_de)=/[m)&vJiQRXBJ!)Ox`bRLC*݂%,/63O_0,_,V`v% dezzX$7C_O7OL*0c ~uI\MgV_y]ȸ.&V/I-{I]αKUGyHXN~%3A.3 xH";λ`eLPp$5³Ö Z+1ΰM\M9VcRIu1oC}.6nU:хvk_j_4hzQ@I?v87 py9@Ҏ /(Ȧ9mEV`)(u출^`BHhPnwǞg[eA_ZU͠'Pg|ݗe}YdP`FJ* L!:0f=~M1ۃ Dߊ7%bw]U2bq=h[-QRI`VBtrQxNH iԁ'&$8IV'M*8AK2F"'Npolg<B`fGmiDѱ 9k˟eӣw>HcL A\ѦsF&'|M( RyP%/:Q$g]Mhs)mA'mp"+dFV@eFq# C!ꋥX/Xa3RH/!'8EV<.m%<sxCI}znQ[Z@ *<喂I+),D @#D'y49N[ m挸QRJ'4IRsELJT-Hьi0\\Ɓ&vs2)wtv &W.ϊ_`Z;rea>48SBS/4ti0y2{Y:/X3; LאrѢ;n ShoX8%S2{&GqB 5E%vrENεp&')ڲlw4,G(\>A@r !srwATyeG2hɧT}/oc.\13|/\%FE^WWM:;;^^.UB|$88=ہ]ÿop3?ݪQydLŝ5yv1U_^L/7,*fQD1f\oQ/vfDh<& 7OF!mOO']=?,{pѸO8y.> &zr`jp}p~lmOm+eΟ: 0CH~D:Q,TdauJOh1O5N8~ y⏺ d?}wo.ï^{2׸ g` I \L'[90rQ|;|:}vV 䣈[i oȣ ҏ pY'LYieeM/o~_E٣}>Kl:D[=s#e/39.|A<ؼC #eDr p撕ɠeQ_\Fҏ6̤KY?q8*&! 
&%T(Xǐ:r)H|ɡaSu[S-wOUmvzs4^/=IylQpa2=(&JD9.G-iL X)8R$X&S>vBF-"[֙Cu:y4iiyCIm"c( Dh,@{Ђ_[1 B2pSo|ѦMkPPR)U˫ۤa|3@P |PF+qRwO.mK_ϣQiȉ׀wwh>7pOX:WpTzZTJZk+U'RQ%hEdr5?bZXRIEUs 5s_?\+m"%ܗYUZSSɸBﴲ+ߕm=-:g;-"_.y ;_e@h TRbK/-|BJrD\&ؚE%exTcɱ5mDO}f*4j3*8,q6WW.=E5YJʼ)({˟YTV>P-Ēj`WZB n1/8W/S d%Z|sj"vdˣ?hG8Fx6 Wd)JQߣx|5mA7CdWb 7׼1)Ͻ_g0(c֛/>F"|MKLfȥ* Y.cQRXBJL|r%ES48b  3-MICኡV19[V)PŀwkBK=L]JTMY׻'_/SgG*:gŔco?{F㿊{ho{ؽuwj$@[Ev$ʼn/-?ZRʒNUd[MQh6q$cλ ^(i%}~7P=3mhs cbS@2]0 :P˶r(G9 rHvcz^&=dݣR8~lo~zGzABpwnf]д3㉽jhrpН"u)&Չ¯eLpݯ:V//f/tET1טQ0 d '=!PeEheeDdXSlSV*m:DBަjub+8clV"$R_y:ՌޯZ蔼_ 뽥7͘3oHumqxؗh7(a͡B)vӻųɑ5mw !zrg]Zuh]B x0A< :ZAW؁Jo  d gJhS|RsIZ\(Zx16?$SJFU`#$I29d \Ii1$4W`b V BKc(،=BzC/7k$q?Oa6.o?uLK[㶯ߍ] ;. ]o xHxU[s=z(/NX*hVUo, vY;nlEJl.Z' ɂ*.BHXb[h"Pbmna@{7*O¬UFi׌Hu>DeSs&=NY.az!/|p3 |LR:#5H^RQکl]i![e2 e&»Bo$tu7XZCvX[2x+/#(|;7Goҡ} _uǼHvv9k`x'aـq%/% aTFk+0x '?pB$\%=C?Eҗb z!+@Em!-Qa/0De4` =Ч[oGu5 &Lm3[?8b b6Zc_b-fet=>y#}t~ ";m}b)]E起 ~Tg 4Hb͔6zUe%I(<aAH58hS4:g4dPRdm+RQA. ~6#g7\ i^ wvE^uvK+8=cͯc6onoow dz`HڐI3#0!ʵ%D6$0M%8iK !JV&#Pv1YA(ky?,T%[_Y@Wvڻzb= r]1XG,NL?J^Vj:J3?*'n[j‹uv&Y %Lg5UTJɹ A)6X e֌v40N)xwEZ m^td4׾|]d|H&Zv 2r&yme#V{۬\GnCR,^kG& xS,.f ,֐5E\|%̮q 2Ij d"DHZ[rH϶mb@ӺG+r> 2n!cՕ}q$RB" sLE؄X!) 678*.k+\Ze6RzJ*5 %X"Ţ1SNWՂ1$2vJ2V+jCͪV"Y"Cr!#6QQhlɈ.gIM)<C~Ӌ]}V= N/tz`8^Q'+ʟ;Kw?P3ENul|łY㇐U1#Wnx_: w}maxzr76GY7i ii2G7|mR>[] :cjsC#BVC!`08 Tl,P$@ҺL UN3زRj}:WWE"g d dN՗4*1"uAKb"z; &t൳c>I|MPE̴ƕ%@F¸h.IBQB&[@Vߐ2I,7^itb^@n lFn7o4}<wa`t&|x]tsyC,q xﳝ罨qcSG4dLJYy[( (ZdPbPA] xSN$ 넦с/%%6&XTz(VY,!)PIJflF)n[Qqta+x. u! 
ё[O4X^Ӌ>U9[Ɔ"VɢEQDypc&$sl9T[,A:ߒҪM*JΠgYUh(*I%SCjέ٭7S͸Z{ݱ~mAk.VFG"m1+ե9h WB &Y )X6ed( [Yg]$S ; &Xjuqfև ~Χ*SшQ6ֈrЈFl6[-kU H2r9KUʦ׾]I$%]Ka8;nĘوmDC ّJkHv}; xF?faAԒ:茵T"/LB k\ׯD%Uqz(CrVy܍//B|b!fެbr>MZ@ p|Xuy}z׀y>Q,{zϣEqb\3qy$FZ5#OEZ>ˀ 2ZY%$I5YYT@3))=2N/t/϶-;o-75m>4 ry:r߃d `y<|bY#WGA(X(ebˏ7wjԭtQ?ͣul۶qUWOVŃ-wJZ |t+t -&ps[hvxEgoeaf̅W&a}lq墨xt"c5LG+{y\ <0Iлu OU,mx 'kTl&su{ǻ= ii 66JD t2עʭ2K{*!6 ,RZ4A8!ˁ-m)#_`B.lͩ5^f-l:y]u1O,{ |:C?}.TaO(mlJ6Ǯl;?yƸia<{bJV׬h%)P]SBI!P7;q6/i|w> rNЭF\rH,gBRo}\\^_h2KuYل/t&.bY}"Zo>}| 5<3p=2Е76x=zg?jrxu1 W,¿_c=# T'K1j:<솓lr=oE?~i4l77}m˯Bb=a XMMQ4) 60洅Ȕ+q,ۢ,T, '&3of}t CQsET4C1`Z$']fhjr{rdר]sTe ,Bԥ-d%fup6#t#w*';̍.ˇ?{~~xsߝyurހV`+.({27 [9#7R){M.Eǯ8XHn (Dȅ~|W6T7Ku.Svނt~aavh:C>gݠr; |\w+ȧ)|[]PZ MDlo+ۺRvov&/H"F42\ ӽ:IhXP`aQyT`Yy$僑 -w+]8z^PkU41J[H11'@\0Az \E;uz V>U_8a>b|4L7;NKK!yD 'uz@Ҍ%aW7KqU>xU|NOJM^3K1Gdz$1[ XK"V4p)?< Z m4(k/Wsf@dQ,VtV [̓^oNDz}$xD\SqBxBpgeuûl@sxCnuZ?~]y[u&s;@Q۬>]NfG0@vY]jtYl 5/;<k^L'_FɑxoH-[G0re[ٍ}ɹ`1X>^ciKy$"kY"MÉKM>?xң3 {H%|0{reyÑ[cR;ĬW逰 9HysQ\%;*.#\fAdzH7J~ؐp* ]"A_2)|{zo(>J/y>IO^vja`9a-(,\= X뽱\BپX ZvLPJ+.gU\'Yȝ/\]YAFq2ai7GZ)Lk7ho Zw&(\}9V j\Lŧ춗AI>raFr2ď8=B4w?>Ƚ4댐#;`J]zyK> `%M*L7-ѻΦV1Ȧ5r!^7eyar*Y%JNőp.VƥMK"2)"؈n]O.fUjKڼ[ȕ9T$E[?.&oi <: yzuE@6_fr5l]U{wdB0LV׾]':8?)zT:&7Ei?v׺^f;麡o+1;sgGIloC-6&\Pd4J@Ie~jԽƜ`r &@.0jLjM&2qu@X`e6%?|4xdbKW^^t;09%NleG~ &Tp#}R#K4p # -hzWa`HThs!,Ѝ$S{+%RgaotZnB]U>%2ͩIYոq>&yQkz]ٽ# %ei:;Y6VV- 2Uq|K*aYcנn0+Cʘtowk}#jn:V]|v mU{-6_ҟv`x@7$`΍v3"ا"k/%%H *XZYh6A7Ir@ _J*҂3 QnX1K|/y/2P@a>aޱ g&(1Q<D˷SJ+;I>ԣDjj> `{@$mqFYg%.w>p΃b{ǡY^wU&/aEoPRbaJ³-Јn[iїĖ̦Aj2۔ÎZl5rA6=\FD$z#.;J :чv2h_j_ʵ24o7((-"#ޞ'=F_] YgAx! 
a C P6&Hg pNp+}Wo<38%&|]^Jmob6<;B*z=TYDLqOoq>G5EjX [bLtНjDMD&7>4 t]-che# * Dt 2,ERJ8!jF‰,`8bjU<0!S&aJ;f2RA+ DX/5^#zV{g9E|mգpwÇG!2Z 1DI9mBc'IiC&Ϲr\X́J/:L(]}UvP%ṽuEGGSQ:١n6O:a";i*AXPYgr4")cJ&WzdX)\S`)"T J1u,ù_3`a]/H,:C0.tHBh+[$ ,4ߌ_&h;3Y*?p]T-#ߚ*>ttt8;8\|+-l{718(fb%ȉtFJ3#L~nͨi͓~\uoU/\ΚNj=`90먈YT߬삓(+&ċ3S̗v xH/k=,aIş`*&)>)B7cGUSݓ#h$F+&G-S`!>?N.u l!,1C|Iy~ qS8G٨_gntq_>=Û?'|x$w3Xt@ٓA}},,Y9oO˯o èt):~@ EpSFI'B.<>gTfEx򔝹 ?_Gkh*вY7hƸ9q;.=2ם(ٻ6$W=]3€݃6;/2#R[&eR[!JSIAɨ/.z  XK}K}>%-uHv΍$Ul |ĔA:a [F-@P`*3^Fc{ҥ1z=JYЙJ([~Z'uRJgm" %A&Aʧj'̧Z ԷȧH}C; yYF<}<6M{pՐ!I.II<iRyx;G|9p6kqƘdEiׄ[b"-rޚ.` vh!zO ٲST9[ӑy,Oq#ϡ$6IAs2@ F? m֡[BjXW}_WgzOHBsٻZ}ݫ?J}Ny t'gTTJ._n(6{Qk=jO;R=2-搊R([8T+ vIZe+N\bKzؠQx@HhK0֥pG/wG'dVjޖdٹ@f,d'J65w:,ľ.Sg۫KƔ@gkEgDXb鯆݇$$A<=(w_u9&a).HB;+e!H_ZhJ9I);X> ;A@ҩ@_ak&*0oJIo>Ub® 72zɛ_ ?s>_~<~Ncw8!|/??Z vFF9EO)Z2ln?o3dw F"+}:6\L=ca"܍򟰲a.nɯCdzŋZ=Idh: N~Ϗ>' _{7zލ'㏓%l~:L ?NoWVI<|*i.X=m.F &L!N]`> jjjN.ĺS͘) 6P"9)bbVI0Xj +iA[+6$2/ocuqb1S16)d=(5;scSgL]Zuh\ }D㱵3q^ 3fW Zr TBʺ8CP(ZZ61XnxIɨ l(I,m$9`̦vC` LLphti CL.4b?h~.fy={lƾCֿqF\K, b4xU Uw를hiClbXYw tVsR(/mi')ڥ"J P(1{2TA"QZC9iU!v5#g;Rc8XGG;E O#-,t__Tb[uQ 5 D/(THl]i![e 3y0Kk:m^cyo[y"y49kEMNNX1>裰lT۸E˿B$Q)a<Ɂ\|:Q =C?җbLBJ/dWDm!-Qa/h2denIcBUW=?]{luu6y[8ob>t7]+Ժ`uvO/ZE}R/);0!9%t Fpz}k0,\NxU+Xf٠5UL90dNxԽl*?Y6:xW? }g/\tit>MvoI>׻=N9&_*%p3r>Gt⡹>3Ϲ@}x~<ܙŤŘ";m}!]TuAfw b(L9VS- =$ I!I64C(K[V9If5eQQgX'PbPtrE*t*E4,\BؔiFΎ'1r22_]N?͉+fQt}&.|:]~LK\xQ2uƹ(8e* 65Җ1#! )@NJ Z@hӁ$Ѻh3rs8d<*p.=[<<kl;O䥰Zcg{h2I7_՚㍆:wL™jbTPD}i].3 ]32a&Y@L.i9e6jU ILj@ m`xIxkA>1kDď3 ƺun l $[HZy98O+&1},A6ڑ )A%lStt2\Lڂ޳uCR)֌fK0ȖD-Lʢge(i=ݡ9;M Ԗ!-o(bNda)r 13A.m+;t . Ŏ+\VyΛMRz*5 PPQfCDʽz;&塖*GYuQXrFbƧc$**|4KIcGcJ5{@~Okߵ;pȟuh78wD=u|ގ7QzN{KWR@FZtߥG䤌Cɋj{#-?7:`:~Ͷ{FӇq_QJVz"4b>Y]AA1e#BVCҡ`08 ,!Pe!C%R5mC+ Fm!u i"g$h5QCR3"u2TJw7#gpL6r|βE|0~2OomM[wXp״W^qkt_s5e}Uf\+wLFҦal{.C6/]W} Ig _]yy~Elx8f _|7o_|nʕC-X=m-AܳnYx~E&rk.nq鮑5oypWgCGIcr\OJ|S7ءC`&=z>J5vH|F " 91~zX@TyŦ[Ӈ,qE˰[Y :[rO|>@|t~Ӟ@v6\G!IZ&BVqoZJ%8$bafuFdG%E) &z+J27^it轀،,Jo'v>o>aM12fŭRYj) YGND ˄с/9JrRTǫ1GbT9PRLt:HiM(,ZKfPy<,l3X * okmH_?v~?] 
8ncH(W=CRđh)1`JiNtwU*![ N47nfak7 *7M>/_%GM,&8 'm.a=)4rQ,'|0dd,6`+5&c6$)9OKl?5+RQj/*ڻVׂ`S de rNMFRˣokf42 pREm IRihzȐ & 3A!pD$# hTv#iyXPY0vE"[JDZX"^"10_\(-6)8k(Z匏> LhI3TtblQG~af.:]"+,Y/{i.?O{,ʫ;Px}Zo=GC>)ZRUR )M|r80}TWg:w91dnsY ., {Hj6{p1 j: YLq0F2%v}q|ȿMڽQ=_5`^.з|34/v0"-m}Xs1.G!vϸ"ìɲoz$PyD"90RL"+qc%jO8KNPX g 9n?șgz8,oqxri5kaOSS.:x-3[/×59sg3esh _zxaxWZZګɪ)>IC49YHۖq>o,R:RKR tbYXhEpDg/p葁*<3 D. ~.tx`:sp$-B1<-U -V6q;]<1G̚eHl&oO^;eҚD@'=}-FnCӰ0FBQ~a[qR7epAE^P>+192納2 K!ZAׅMBBkW]9(Slm|I¦˪N߳r>@/r\?`z|ϪP`F*ᒪ J!:0f=~ϝ D߉ ޝ0;~*ԜT!'W\3 \ԁN. ω)=:$ST'PD) = . b:MeD.4 3x%!`fGmiҧeLJv?iXL˝'+T:Gn!D0Y˔$%+J1mdRhx«EcEP JiT^26?j匔*&R7hZoWwBm,A3b|Nm?݌C_J5mHA%R0{%%#p x<. <n.TvqE()Z)_"%# RF4cZ4 'qI=&>~f. Ϳce4 n'J~}|M󒮿5;n #68 "Zy䄾 !WSHhuTqсbD4Ea;'#j.(Krt甇#7(/`dŽOh,G>#r{Z5ȍ78pZ{#gy xɧT_M'y{cX?bn13zy:99_\_B|(l;\>qz$oW's¿-oƄ~?Fփyv៵7WŅ71(슘 pFófd5`<p͹vׂd{OBn颭ލݬ/wƓZ>QZ|2,zvb8L1x˗l])䖣g١KK\>LC~4$ :Gt0i<k+C4ۃ0<_7߾.ޝRfNwoA'9J,Gm$pl~ޞ o6sa%߿owo?/l0a;w+,'%xfC# ܺlR*AT(*1`$VVKQc`pϜ!`i<w׷(}rV{6.eX"CDm"c( Dh,@{Ђo[1 B2yoSy?|Ij((כ *lrHoWX t= >x0vY~DFGo獄,z7sd7vչ۪;Q;UD=#̐(;Ϙ1(Cel2 FD9QeH8c<$OP2uPFk!(pY{$ pփ˘f ,KSqC}yCehf aZjoJar=&zkUrn/+O`Fi xD@EFA$"rBFӤ3ώFJN4ǗIS&> 9pNVFJxRƠbTQ5Cu7-2T)`H \1P h(o$E@ ,xve!.j-lQ@Jis. fjCvt;;ƾ~o>Ӥ|s+gDö=;cp /y*7qi%Fcy^E-VU^ G|gIU ^=ɗ~ɗ~~Q-OLC9!D5XjAQ 1&#  NR&ɢ/.%eKug&AUI !#zj1!15gZitbPӛzné6_;ǨM%6TP9Y78]>;{H"hp۹2Cei F&+u.,'Re ™/9&f˞#)*#T謏"@jd0,F\d7*P`}Y)Nhd!kI|/ H$ D#ҰlQ>$ާ҃+봣Z$BBC"FFYj֜I-$BEKSg Bi]1rFj(nՅ9Tm|3(j\kM?l|n>!KO>zJ6ZzxM!1ի h]qB^ZeԂd&[y 3]M ВbV/ ^ މ=݀o{@Nx8AQ.o9;btt^{"рɶq?\@`!ʾǞ\t߽ܦCM'\&5B?J9ƈ2ٻ6r%W<9x-ayZ,xuHK'V˖/-R+n`DMdǪ"#Yha<$:c`P9 :e T 2=&o9RhJWvy!z{>y.+R3a gRA"sa i-ʱTlC*Kah%pA?<:Y/ q'-htx"F!5ZXe(D@ IZV Ϟ `!{H13Se_4:J݃/ۆ^D *˔e j$OELefgwtޅ? 
2_;!v2Ks#s#,/Vm6Kg0_'Kz CͥGIiuBuC&H-*cѣ ZpGN@[B:LVfVcN  Z82lۂ~;ϗ%WMvmIvmlK\Qa3JO0Vĵx.0'ŠҵD"pFpEKB \i;\) TzpBH`+ǻ**s.pU5"% WoܞwE y>"Z \iqpERJVWA{ge#y4B( rMGqFӭprqrG%>21+&&iѼ龣 (r4M}&y^/Zu,,w@S e;Gs;Ro5xOb9 Zp离LA6_뽨OF ^'/6> z_D,6,>O_At)P(NKO/?(7'HbcM@,;uJjlhD{ ͝ 5BHguc]Mu9.9jfQ:C!Wgyk(&89ErE`u>0qX=i#9 U#&{N|Ϙ_^&}m+^ؗIˤT#K<puCUX"s"c+pJ'iz.U@/;+5fjkwdW߁ҍ5R)lϮ2dMEV:ygYysz[:zB!qR(B)Z3Ǣ7~F!R?n gPI`J_Mڟ?gmw]OУ Jo|JfWEAҿ>S{__J)6YndNC9IA9lW\*{!$)!'|"Rj?T"/4c"SBd  g*Q9 H%nG2\,hmhm`GLvxPPqʽH2nt! 4M^+>e3 sdT>&B& 5y@f9eV &yR815лO_;w&fe'"}럛1bu-4;_8D;wN &}Dlb ୍a1ȐQG<{}ytqrٚmi&"-ǓEڠ .)>M.%\ۋ4,]n cYR|?<6 rQbԯv9Q}IGlWg2| B3&[)Ku֡4଺SVw#B@y2Ky@x2 +:yIɥU#ؔ |L3њ "GpٓT^oIyCjݾiݬZos]S}4eC-N[v>λ=oܮ@[/gy;: QJw[vTsRo-CAȵ{׬С-وjlJ66A}4!5d![b: /W%3}3q{h5sVģ~imǦOUs9m_,qXg_Æ!?^qt&hi31+c$.KH #I4JCN$)M&Ɍj"ru d7 %%h7a888M4}*'ߚkZB$i^|nɃmt& Gbs^-=ad䨼G /@7-<8ȓ,̊*m!& M1{ʅ1f' "ѥ9KgRԠPeq0qzꥼb bI!{.hX=3[Ejиt}-cAs*dPփi)K|X_Vq*%6.Δ2 A`h3C,1{(qv#Fp9L;DmɆEՃxp:ɨx*BF(wFP"Xp4`W:iC>YiGEW(:dkbbA1hr .ɸ g7AʤcAPD*"VDzq:gRJ{ VXgY$DFirs"AgqX&36b㤤r h$g|eI CHK#Hٙg{Ss0-9+.V\5dguytYl9>s|Ny}t{z,2ǟ*HX#h`Mڤ\>]!1gf]L叵%ݳu7#&63L8>8 >"V[q4QyjH]Tx.>.~ ǫB+ZaUjģ~/Orq-\StQ% $1W7"Ir*00HF!\m2 )~$\: !8c{&S(CҤ"&'[Z9NvxP2 x&ƅ/ELPcoBw>9 >\Q=T{<~Jr k̾63+!'߉(YIẐ#(Xg) E?=ܬ8>2rգ-e#*CbV,%|Txq:  3jY(ѷе}! ENБpM뵿KCx1.A5ø7))p5s,h2#"8!kj0I:crb|t{kT[lnf;sn)ʼnEE;huoVԞ ';>>3@{q^3j3ޮr pRs8۞ .^;:Hbȭ.%E! 
mޡC/"_)qEBseqhٳMGpgn#k_$:~J-GgZnhѹr޵ۿBKp`\8MHvXo5~TK\SdA{QJ4( X9tU9a y̘d#|pIheC>gLZC&[dbj|eMLJK64xY,5(2GeAXmWсݘ$= Xen7 s_+킢{ңhKCe^X7YZY$;ȤuVwrq8H.R B@( Ilji#A&#ւ9t%cܩ,Sha O yɂM}ԭ &9ρCiMKp{V fT'zB"\Zo3GpmgXWQJ|ݲoMvH&8 Yi<>G88]Bpo2׿ W HaZ.;{?H3dClajMgRv8)P\2Lp%;l);~208ۙLI5kiGCr3H Ыh\]`ltFُrx_˗'?ݏx;i7zZ=;=:]g %w'nr5xI 'K9*#~]Z._fW޵ޜL]fW:c-;s?y?8ͬt3 GO=pZ1%%oh]3b}36Tޓ 'ZD3Xfl1ѓ6GMV\꺱rApꤸq!!aM篣q,&^QV|F3o]}ʊPCk⟳˽?>_~߽|f_?nyqD30%ig]w~ټL<ǥnnVO>XΌC/ϵ)JA&Pi6T/(N}̕ 7#Ju4hZڤi`v5rkvivQjUb#B{ ` לc^/n.b&ψoF&z/U1Kौѕ ZB È=HȰJ׬FzlgB4|Fö m*2!@9[MPreXT^P' $!zəvu&dz>EywнևpysΥAk`U yɬag^C~,fuش6(jA ǥBFIgj"P^ע,m\Y n2K_] tҼFeZb ]c rJY <28qϳ hbp.JJ2E3- |5:rPpioC-g~;V j)WvOhc׻rP}?O5:Nqdw ӻ|tq=(4 j^t6⧍ Rq _By*~. Eq*oOKg皦] ڟ_-_D'OG}:F$&29+s~vv"p]@Ҳs3$m Գ mQTg+gvq;KT-x7S<&A I胲GCEu:"[lMrZKG|\{f~qW~n ,^jo;Dߙݣzo7K>2{3,\ӣ;#%xt/G_^s ʘRGIJtF41ܨ`hȵ,Icfߚqfj$iܩ/DYEZ3#2Kp*EJd S",)0 0ZF&1FPTtзF*őT/`7(%cN+P9m*(oƮ;C8vrKK?σp8u~.{#gV& l_Q`AC$OIɍW:g ^%wsBB$n\*IԤ wʗt"O6I!gZY>ʩh어0:qZȡP6DV lKQd AD ;ٱu֝ l6ΊoXD:)8DdNHRhBz!h> %,'DMܚB)X^EK8})7Oj+'uD2QL$ &;ƨl`EBΠVTKS.p0[_*lI.A ztBQf@+W*VKv&^Hz7٦5`@ZހR+V 902sYJ+ :Sp90][j߸GVKPzѓmΖ]Ow,|˼#'#s ,ה@&kY'&ʑhdB2,8CHxt'K0ٶHGg+a:-:AO&Y9y-M{˒_ @-jl)c{tE> M#t`E^RqQvDh$/j} V PH`9A%k)CT"X^F6څ|2;sh0J`Rq%d:gKOPqs`eN5CkW \IpN?K?K?d*!^qC-UH"2/y}<;~IR(DסX.+nF)f[#ٺl 2[+)%ӕٺ2[Wf뚤Q+uelUr+ul]+ue֕ٺ2[Wfj'Ufl]+ueV+ue֕ٺ2[WfIΠ2[NVD֕ٺ2[Wfl]+ue֕zf֮Uw^is:GbJrohz)|c8~CS5diC֒R ^dٰXhˁ 8bVunvܪ,MLIY,$s .Y+JT^k)J>v;#5҉vB9Tmxz=)IHecRƀN-FZ*Y=p ݞJ8sszk+,,V,u%Ldq WDx텄GI^k*_@bOվy";3WMϦ^u鮃Rz n{K^7XZ tḬ$ +zr0b60(,,S4#cJĤ+J [T0l%K1^f@R2-sdA20s'(`uDFɸGruC)[cc7c.~.j?E,sy!m1syϓ'xؘCKt!F6d4^|Lyj`5 ?~:M 4r2G 8"f!rs<)3]EXB n~ %L( Q tV%ГekTd.^ Cb&2>31;!x셀Oڹwy'2ޠfb͍2=ݳr^wDgh)P{0BpAPƤ"J x(ǡ*eةGu387yNBy@rBdHOB#뺼bgݹ]Y/@ǣ4+;lZ7Xkl֢Y HunYWvh0q3;4*him&ݨBQH3 /n*MSrʭ]Rr!KNM >C&J )&2JF32 ɡ Y+%G BTw $S8J[j>8YdNLHa–=&]uv 'F|<ي& \oJƔRT~Ul|$fȢ1BI TbGqM{ d %,5%*AJ&X $<'ǒ꺦}WmrTII)eELDCK@߄ 'I&`Le)`IT ]mo#+|g"/,as[,./_cWgv俧ؒlcXli܋=覨%0[,&n r#X$[Vk$с1Ԯ!) 
Ζʡ`6D[ 踤ۮz_icZQլj0uLBd]J Ei|$fl|:FYcQ MJ;"Py{ [{mNֱ ܁/4 pPy8->Oӛǎ5foPEt.]o %'1&Tﻊt󩓯N: U{}5۾YN^\LB{'v`Z/Ĵ?D/|O7E؄uZא]cRDmp ZC{!Y&Ʌ"DIHT&[(%@y.ؠTݓ% D 8ᢓ(G-U$-R,C,U`glf进c;M\]Ji|^{/ye{w9]>5EKa›D5%1LgKW.qnz >լsm3d>삼E"3 mM߲JxG_Gwћ9nfҋnȎGv m}Moo|QbWFnևu~~kE/e7mGLWpmVz+\^r讯Z4*A> v%hz}m؛kg!Ϸ_ oYDE-Ig=YmtytA+',׺e /JVvUk/K:8{r/]o?=haƒ-Jğ<~gV9  Z/$eYhcզ\!fPڶ}!-MDX D:XPDZRcW\3q(M.pbirMZ|ɝ[V_wk^-ʓjwIv)j7/ܐ,Դ)_/*[^壧茈*Ƅh-9EI`TRE9Pv`)iTu( R8;$md)/4c_, XxP,gv]=kO诞~E噇o|@...?_?sgFlXA$%R"`2d>JTMXmeGmrT!W7W6d*-&P"DE$Z"m%vĎ㼲<n{b3޷/گ`HgRHYVu*yC|ڂ<r:jfSؐCFdf(5"XI$.&Q|+EM5кV3qa_6Ωx,L?ED倈"n,Uos M4B/2$fD6+G-1J2Cܶ@wRc5 Tj(gbi2dUR4h:L^4gkud\.LKEh0​k(R!kLْF[PHv4$_X{{ rp x,xL;C~xxk ~_6>6qST2Fj!/?Wo:跩}l),t iDׅXߘ{OJ7ۂ:/ADMeV>eN2.{%H:8Q΂`i/z,:1el?#j9yߵWy4pSg1?/ o :TH^JP|D"./x^Q $ˊm " +&Iό&m%5SRYZiO;gk Qf6!L09#B̻~2OvH<=<[0u[q·~|y\/ ?T^OiZ?izwXi)6OHAE/%2ⵛ!~ɾ#Mx޼0nT>PQ0R葐=dws֏/|7|Lus_yav~P{1O~q>qv<;gfדt F?Sq= Faoc5Yv8ry=jrqy2ZЁ߿ޏjeOON \Ui?vb)zpi'W,0Jy2pUy*pUJ0k+(n`W+|(4u,T%V6w~?0H{TFF|hɻ}Z|;);GBlJ&~zg'U`NY\#T`J+~\77ӸǷB/W գĕ7TG GpVypUdઊTJkűU^#\՞ _yi8j?w٤w|~(+jQ|a \Bt"BRE% MA7*ZƁy"Ei[wUKN$ܩ:QpQsW8>Z*_ʺnEO-U*LR6裡.]Z1֗*=d!O4GWsUPsYWm~:^1RYDa e\ 뜌`ؕ8E33(YFT*Q4Rhqj-! 
[binary data: gzip-compressed log file `var/home/core/zuul-output/logs/kubelet.log.gz` from a tar archive — contents are not recoverable as text]
XV3P5>UwWi#- s^A?<^_=[^~wm7P(q퇨[v}ͻzw4k?UQA0>x( Ky.g7͞;/:yΓl_l3J99DWpşh] ] ] A-)+U ]BW}+@IA:@Bhc++l \j+@kic0xtxM}WY ]*h;]G$4[YOU,h jI HWHWF*+#Q*pu5#}+@I IWv1w-p~+VEQnZPfOuL`ta*` "O%?%|oosl(d20~*ة4njSkQۧZ饹 GmBI[S7~{C_ ܁Ѿ8¥ݘp8tǓl'c,5FjʢE]^ѳҥ^:]W#ҫ%$qs?=O*&dI~Pj/rpPo?E^b8EA9 1ǡjp70,)maKZWһr#!7%NbcM09lYY,:.[dSr&dMlwHS !L9Y?+;4[4\9hȍ6^z u:gXi_o-˱h sj)[ Bo/ *Z/>îZ ZЂR,ke=%egKBKWݳGbC{Hl;m芒Zm* `cךZ NW%HWHWLiAtEtU [ ]BW=]F#] ]q8*YEt \jֈrDHWCW~Π]Ud-tUЊ]C+"*m=Jت  ZfNW HWHW)k32 %uU4t(Y<HWCWYUW{T5tUjU NЕ1DkKKWHweWe]k(ݺ;w4 "45JBӀrw6PYA>@.n`h=Ap˪[ЊޫʂR c6loW=m|!J޳tHWf=ZY]`m5tUj誠5t(8!ʰ k-w棍NWHWHWp#jRW^ ]uUrw*($|knxr] ]`AҕDAlL=ը+@kiUAP]"]i*Uk&U BW}RaW!ҕ&2`UT7+QW!}+@ɉ@:D21 Zl~s/{_ kgonktg~oвa:zɆ 3ۛmc)^ , @}lɶ;3=6bI=܆gԝޠe#`ugoimR݆2n )I ( =˚Y {K+A&e*j:D \Fj)h{HA)qtaOzvL8y-@Wvߣ |=gC6)VUy-tU}[U^L`*p=Uz3R JUIF%ۙ\x8=Kqp.X}{tr.O$NOn'&-左>F92JKH54g0l&{%.'Qm\ cKxʨYo7w7F .NNV$>UW|3-彝erUBc Hq% םy£Ph_?m6wZ}/W>lur㷴h3VY5Llp5nEq8cmubH=-pm_~yLˊqf[lݴ~>9b7Dp <覩d.kqLr࿦gIuZZ!:2DEp&cʙJy4@TIn߯c( _y<0|wYo:wm>u^$N[86c)}䝋GhuN,\@vKy9&+WxZ[)Ix4Ojp:{|@<?1<50gAuQ vnM;Ej*ڻ;$籷y~C< L@Lthdab;^>> !T}>^).[L3lv1==9tLzj!t.b?K3}Kɴh6vv.sw${hpպ _-fQ|N5460론|ex?Ā~K) Ťmry26m&Fl.j $P 37xYC2|zHyuu۶P [TOg͍6+[<5 kEu%,/:I]IȚ,ogJ+Q.\Kn/,"+!d`$PO1&d潆L+ Zig8w. 
= e:-%C)յw<.◭$=f %b9ە3 +"mLSLmI6 Ʌ̤PNw1l*e#/eʡA>{>|h JPfpe}}?TPZO8ĕծ"VÁN{?.?MIp2O;όQKlhqenq;B0<{ <{SFV˪7fF-Fu;ή}B}a}mQeFJK.,'Re K3Db a3qnI!4O`5ɪL=2:pBF qJFRT<8r/G ^Q*vbus+N?]'CwOhGo$c9_>eTc1Gͯ5wwuͨ#_>ed$/5Ivm;NUjxs]h e發o &"eN!OGms'Sfs%gx&"׹Lo\#n^fOH&(KkeAklu+aWh\(GuN.&eJ-Ld-7J5e@`ehUnWP\MIRV^RShOVXǕ3]Zn7k)bQjY""'j3H#s> KACm$1'R Nq)4>`$ΓxKeL$,KSS;}/¡=Έ2/ D "Il=;w ]MrCa̭ۗS_ukziDR %2fcc4+U&iE6w᯦c=;F )w)jYIXF]육dCD}촞9׌3e 3 ˂Y5[CB`YAL6)iO8w\vW|lרKO|,a"232e.2+9,ŔX2REFb7hu4**pQ?]m,','D)CJ{LcUς(Y,4& ) !Zqpbz~I '2 .$ч@J2љ "G9TީK]5qvd`}!hlz''%{(b;*&nǬ[z^ mȝ~Xm^YNeUxeƴtqqW!v!xAb_ۜZԺ{Һ]= dqOmu4eK-~wއ@C-geO'y8կqTp+0n=ݝ˩~u]w6$S]yHytyיs`L[ Vы}PS(ָ[js+s# jD*Q?(erh]PdO&9g(+" L.aHKs^CiW*6eb*Եstlo9,5Y q)")LjmI3KL&uKdqtM&b $ %fvjZP5˨Lfiw™iy癩0uLmXo7 0Py/ R:ŚViI Dw̉l%=fA`&LښcYi1 = -ZE>EksJL Us?2ֳUaa/XhG,<(8YxC^7 CU֟N?,9bۀ"}2Y`= u)I/_d,Wb^":x][VXTȞe*p l ]e4d5 JAmHSfhlSĹtQ̡vٱ'jKVF- 3N2*AJчH 2<LXqpYiĐ2I,E25110dpr9 i#'% < 1I' F$HLK#$SАjrUsiPuk͒}qWE>∋[ bhTF;qG (*2D:d^* F\<.͎}PTC*w?c!9us7z?>#k̗̺wυ[~J1JlC40s.)?Q9dˑh>,Igp֟o76 DWa#wL.smȘa -Kya/!D N`(e 0zn gdNzTۇH-JC ӻ`'a{rn{IěV ..jySGP܇p+@!!Ԏ .XT'!(" I\qc`ԈA2i< ”rf#P<ЗԜKOd.kDƔ&u):Ð w^nd.|A<?}_;6KTΖ׍"Kbtzwed^&ߥ\^޻ǹŭ2?]SNR' 7l0\ǦDҘUGMLGouMc=1xf}L-Bkc6qVz5+'.3}rͿ3/?%899]6|~e_r?b'ҋu r/8ǣ5d!8̀0 Y1l rSAԳ!K=kׯ?.|Se);9(ݜ؆nnص'1ւzӵ;g Uf:_.~}V~|e8͆voGy]zOϬ)5>M`}Ğe6M1~ AL]}|{57"g~1w[Ǖg}tr)[Krmٱ=,5b,|:>=zO ᛭+g>'l~rޫQύ0"FƔTxF&-`& f,c>α8lA'ξS8‘(`HCBvMoFnnZ<rXfڥ%aCvWuԼ^}hVWt7Xf ٝ' ^YT}y&S[ծ{ll.gG}l>='\790?ig~Z#ZJlc$ V @j+}PTc$QPlDa;P2d9@R @x'e"hHˊ$cɘK 0XUuB[mզ-5lyڏf*&^2eh]!ya:K'e&c{{QM$΄m@؈Q;Je˜X %# +eҫDl]$ !3jt>joJ99E)Y`qaTߌV9TRC:MOOba '׶A$'D^bz]mo#+-˵M 9g= vv ahF<'$kky̌"X|8ؒ|W,@,y1g#:eJE)SQR*$ȫӊ?9y| ! 
eU2#r4֤E6P' X!3q <6AzMMQŽ!Jd5Mai ,d9g̱n,Hڔa+*J6 uyJj9cb:"ʐr}``` k&U6*~I^-L݌bccA<$Q,Q^֢̝Pk,6zb$U4A2t{#`,{ ]{A.۬\ Á$ܩÁ˗lDC<屷9h^ϚDR@r$YBZzkrs]6|N4}έCb;\(=''[1(A^.CawG&C9d, SI8H(c:9I>j=3>0OFXmܟϱ͓aj0>YbGՌʕ%VAmW?7>ףEOQ1Ϝ\{ҧe[ߎMc+n5obn=6Vb[Zdi9#yWe*/[Wv6j^3W+cc*#~3^tUfIw*0)֏G}A0cLJjLLh.QWLjDu M)¦FjrFEγ,96y̘P^rp 4!L:AZ1cRF0)(4&˥:'E@-T#5t {>w&x# p+6?R:e1ǘ KivJh^@؀iOWѪFk:ͅPYA9vUnqu2$&DdyI+R IW<Ƴ I8+S.7$Bā:-ւ`I5W(]%3-5n(!q-Ox7A]xV4,] =cAȌq-c+q .8j~pOV//c vCE^TDUw ..[g$)2(Aܦ5Q?PdێM+Z'im椺ףEj^&D̓Ww GhJ e0|d;j4x5k~![.ښf,lTH:p4ni<'/x4Y`ssַd[m}@&i}RCJÊϟƓX&Xƹ*`ux59J̈Ǖj+揫8;-9V5#T#%ԍkN Ҋ BlBfQ3H& %͡dOgT;ݟHu6Ro;ʔoVi;ʽB`SJK-M.t!j)XQւW@ɕ~:gC7 w?,IJ&tJ7N[m ۻ.[a{'i}g{')~~ލl~C"oH\["-}"P-VyCpE+of"WEJ%zzpeA[ OT0Gq|WLVL)qDXUMƳyDI>.jN/W {$3FkoskM0<=s0l|=skJs<=s00< 'ƁB<}nq0=s0 XZB:[ZWCz|8/v`NLTV1Z:Ia `92[ֱ{ulhvuuG |fX uh9`lHHƖ P֛BQӽ'S `CLPbq)Cf)9ȉH*Of6ax:g(>'̎lrӜOku K^ެĕXOoZ?ߢUMo3ÅM~\,^W=5uҨBm: f֗B.~\l'/2TMKz^z}4JWe6ptl*5uj o o|REwJxM}˧}fΏGym*N/ yp֍ߵml ˢn=<]-Y?{umn]wK[:OKczaܧ1OgA.U[0dZf#DRX{}rR{t ]ĥ9GezQdxF*X>zK!@Ywٻv[ {:;%"tbfȹP4yL2)(,2B0C@(rDH}fǮx(:Cu&9<7q]\Jُ swظ7MJ& swV}g.R^aw=3q3qIkZH4>'1jAhF. Li>^Ζ48$V Of>_ޟ,Vž}J]سhb4% MHJ$,:IB A^\^Lz`x`1-L w@@!Rjə:)ǹ<M0ǔOTZp%c(RNZy $( e#t F!}BG@9l](,xmTtm vP'nEk:5ҟz>5ƁGS!r&֒)$LefRt&W 㽍6@} :X<x2;O ==|k#tƢr"!91 WL~/PಒSZ̢'cꑧ)溏2%8_9޼[W'F6  V+!r,hqY`g3oFeoZGC16 ,kn6 c^Ȏ.@(*$ 2wFDʵh(cT։,xv,a.;MO{w:qs= b!z LZe.ʩ+$)1!ܾt&->ֶ>nU@jD͵NIU&7ZTR;5ǩ}_͘f>J\ȓjRx4d俬iִ:eIS^5J n"-*lU!aW"(hL#2Om!"_m!`lqΛ(zzSď Iןo#Lq ,8%J{8rr?)EʇuR|EGZydWgU8Lu+iL[2K θdh̎NxS< D\5?N I嘣 g"s:(at-d?rه8y\[n?F>t:ŴHxZWʎXDvrr<;YgBF(ns &"/lPGN$U 0zkO4\^\kk9>|a~*0b2i!r2g0reW#hyWp666 #6ӔʿvURyK)>;ł@[Cxm6m> W6yø:.=7ZX񮼀.fPxZ}q}%d#V*OF&^B1KKgR)+ ,%Èg%*\3Rlv6,KUsϿpzP& fQfJQ3Ih4I*lw:VL[4c6dbϛ·pys(ϕ.4FZ_ވ.KHHeOC{(ؔEU[r\:+W+筭H*h 8'pw%b]W<ʽ2HYG"Zx1ȸ̨%3d,1'b4i-s;*:l'NKV~vd4$}>˭$p܍ ):"51 pjhUiXYDP:Mi (Ie$0yD檠JA&:H_CʓG F9Q2Z[ xQ !t\yZI笑9R⾁PlNz9iߛGdži$]͗mmP_&y_}& ##OGTv?΅]M /5V~#yӡl4߄ }??>!qV^oN>wdb԰@#v Έ/i&ifeҒfgZ;Ndсс^#@J <:w/]Gmtf|"ܥ' siw\'C_ʵd~M-r,6jqHd|26ヲySHIګ_t|#j]'˩'ORtٟogE9N+? 
1 C34!:] ke* *kБp3>fz(gzhgz0g=, z% sqL)CH"2R%y²jY/BJt^$tΐ5E,B0_mhvm7?lm:;J.9w H7>x5%nWtRa+HqڴUSzrq\MO-R261+[9BTdV̹&f,?u?U}[WgkhVyd~Luy"Ȥ5HAP0KN<힌  J61M ܝo65\z ׵aU_SSl >o!HО2e'P Y't@R.E:Mm+'Sm-ln]0 =0cp&kQxӉ VYGb2A b"B [q=a+$ V EH8"<,e㒑Qf" Dmcg`+ \ʓsxhJKuiքO'zhkZwI0A2g3Yg%g\LE$VDkyt{, gcX:`L`3ME*2 MFM ܕAnӋa<0KX2F/ `bH!gvHr~& l_v1`Y`ghՒ-KJ-w{1R,*9Z-We"39L"&S b\bqG]-XX.gÃ9<Ѭ&u 'c(!ѭqU\/ə E_kw1)3zlR >-?t⦆ٰ nd9g艹3sbsڀ.ݔw7:|jC* ktSr6Nx*XJYeջhR49ȴb EޅbҨL,5nƱ A|sK6SX# H ),a!Z-zytCа܍i'tTvn]ぺ{a.cs>Z/ǯ8QmƛzBe5V YG6U@q@4.ZT.6=AoDql}ܑsu]B*n3y!~c~ewqu;zxbB)$J`z}seJ?bj ,i(y5kפQ jA{К%`Dj\uت3L)ŋNfŇ!ꡧKz#&-̋+ =7:ܗùGoDݞVm}ޘw| y1oi9!]ZW^?n-uTVrÐ~MJaѻlfq~s]᧠B+ ^T/gdgh j&f?h(1^Xdo/ w>5r?>:Y^7NKnBro[\H@>k?||(J-E#tReOT ۜCH/MJbC-8 `KPZNV#W9JKRl}t frQe3ztm7W`{{mU{k_3{KfNdY$7HR/XHrŘMZwIJfIPRޤ DAƸ+]wEt쮾GwŞ/.s'oݖro{J}c/vavs(j __AÂ4u1yDbjQc*"JYyK;GsFqfSNyby_uo~uw_pM?˧19o.s~zbd]6fAׯC,'7"BSj҄]Vwҍ<ȵжc?v.±]4~w^{A]n뿮=N!EZkA\lu{?褿N{)_9?~8|np O?CR?{o="SYvEjgYvEjgYv݈ǭ6(FAѠ 5ܔs)jOVLТ),C{֯֨fC|]@4d \L֊/o)ؘgb[sPk4Z[2Hގ! )d꓄=TuQD\*'1F z }Qx/qU$ Hj|nueSYG(eg;-C IHܖ)9m:T5 7;R4JffU2j,xwK1p⣠E2bˠ45Y1HfEӢ Y+R0y+V-;hmC(Ugv6l9]}}fvZvpFշE][5d&k{ŦbFK2R(r7:fiGe8{R^ ZKPX)Zeljȷ`4km eZàV>ܙy̓6DEZh zd}SF,@ȮV2;AcQ_aa{b݄F 5UB$&tMOd)J"͹ 1]FPL;ψ~;SXEj|8Ngwpsؐ5@uI/]5>W t"\UKympY+ϭzAGiNYT\7_C=C/K]&x"/}~jvխ?7ewE?mn;?W31gx?tXbC:Nϓg[BjxHO.EBz?3b"n[LqGK=.n u%`YhnѤ)d'gl=s%lz>c5)FZY(xXMTXƎI~]B딬`ISħ;/-^כֿz9_[zAp>/}< }NQch_*%pIɓ蜝bs9RxiAx?yih$unuv S'hR6l}OG=p-߅ڛERq` RˮIo3ʽ5H4m#2'Bm ֚| ]b>)V*բpI)~t بS ̛↩׈)yb6 j -f L 4)ln䊸چ&fbɔ\@-(Q~M$EPRcl ÖS2:K`Dr9:SqZ9Xd*Ag]`mvXgcy cQ⻧揋fy&Yo9mO4QB [/ Jf_3A$)3&0{b"Sf7ڶI)|twmuӦ+gnK̉JIMywi@z6BOF7%gDGe~f kD<[)!8%66 $@_KvE11ăN;^CXsȓTl}frQe3zɰm7W`{{mU{k_3{=KFy K1KBK$ V(Û"W^zۇߍgoDttN NEQ+x)AS} #!7(mORvvr(&IЬw*tŞ(OՆ\jqRVJZ_77F:,\Ѹ`Ӝ {0/O^N5r3"B)?<_6,kIXrhӡ311"ilt:F,`b>y.A5R{C{@*-Πy܁sgh k75q9.MK},Fj)eU&;e(B);İ|Q2g_>z5sS19<2x '$($# E.5!B-o Mw24}8gҴ1>mzPbfu kxcNkesR Lc88.D2ĢK.pmm%e?K RBpoA)1)*.cGϕB.2KeN&{OeI`Ś_6waCm>eش3COE&66/G!&c7Dy^GmSMZmUP{2j#AvCBdWLvd?Mv-M\3H&{p.h2K@/60q͒  /Zgr޷ɾi/6鷷3|tǏi(+FqK. 
28PgH9JL*VDƯ̆ I@:;I5/PE+)nNZgA퍜݂%/R=#gNP7>9hgh s=rBh4Y,- K ϼ[ң 2aBh)Ӂ&䂠$>0U 72pUR}#co;7]?,3B3  ,f!]E{rOf]mPxn|1_|"f[Yl@2 3:Ryk%BhҌҀ<}$?}t3#dzznZT%+f8c.({"ԼrMꏲ}#wU 2чP]D=J)۲ \&8۟Ȟ>@p.Du Bf ޗ:Cgu2%{V y,aT>{Cg?€dsJb}%dN&l1$=6$*1!͹3|846>)C?@rKҼnT5+ĨHϪT&DX~_};쾍In2[1M˖;THnfC9sV|׍P;b(XH^0.[dmCLgXwtU.M'M [ ws{7Uejϭ' Ŝת ѕ<^Ve7/IɾtTʮqjwհqFuU3)Pg V Bm5b֦\bdҫiDMÖտ0oI\qpK*SlE˨{dCe=;rMh Y:X8aOpŒ2W xОpזԳi*_p^ͩg)LxuFw^wsq>TǛ鮭 6#4f9nܪHoϪ_< a1Wt~9뎕?XC7 :y5.0~I9ߜ:|ϪrG;J☓?鿟hz4DY}E|W3@|'ỹjVcKtaܽ4͕nr=Sd GrwYiŴś,GjБnlBfY}Xd9dh:\7D#-N~m-?Pkl=Z/r3>Bz0*R )e8n&5v>%˕.ezT)U@ԚrEz1Тl\x'QC5 b; yQm3PwCʡkRyzcΫ O6<,7r8-ԈcU/en%q-vg^,ʈa1cdNC%}+kb"]%0=ޠRn}=.m)",hy*ˢ.wMkWZy-BF>LҸS }wt\],~CyoG+Okݫ$20s_?1 Gp3wdۺ OK]ݠ]'&۝8a*NUg&']-t6sn,NѼӯ{Z<8uw#TeH:>V@8Y,Z!!fL^P9'sEZ:Bwp9?@C3s gO =Dd}sIQ`#dllj4ׁ@bE'g1k FGsڠ B>w$7{b~£Ǫg➡T2Z|X|+V ,:կe0m7{7 BU/nWnU] ^4!mo^LCqAGl|ryUd\ȵ}ÿm?˜A%w%pArPf$]* 9k;r{|Zblxqe8Mn:PP0bh=$]ӻ6m'O: dZIV@YLha_=ר=\MhTm̫fqfyZڵ6r=[ oj=׳Ʒ_eXr Z]җqcu3ݮ̦o gU??;!Dvv.o^G V!:C* |;OFÏ<||3yOmt-BCmIEɬie\ %PӔRu+-nJX^˝5M:.qسq)`l#BIheczIo dLZ;&2GS1|@ zo_2fikPXeeM*٘$ Xev{c9s`N/&8M 0ǘ.(ɉ׬<6E(4 DcY~3?Դ 6Cstlтy%WY&0c Oɑ$69`~鐵D0iyg^kJH.p f zB"UzL]ߍptϾ3lnO+Z,Г1FZʡflZ!el&nJ/4|VJixզrl05r`EiłL_5#2"n/;ybŢR e QX&j:&"o;vDI9N`GӻNن=C{OW'{h$, s'7+GOň↳W{U--Zff,_T)lᨵOI*sYGó7/z|C;'ítrUAY;n44hctʣ, FyP1?d?tnF9aA88ӇO~?÷ oi+X)A"{}Xyb[74lsjm/wPR\lf[we֚NSRsH|$O"Yuײ{!Gu6]z: hW.mT 5!VV6 3;.<@tVR_<-b'H|&Hل5I!diy9Lʀe ɒcq J2lɱ׌TGz`ÍqiZF e|2&!l YUP$BrD@$e{:+'9aNt9=yafgY\>ex%p Y{2Ԍkl\A:R}E H4 `p*YK>|vh#> \/G gAXgd -{[=uwdFeb^uw4>I[?aOOҰ%(g21hh0XvH(v{wU_5eq8<\_b $#9gEgn D"~>@dfjMjtZiB=A*+TAkSQUP 7p4Z 7 B*NY\i*RZ^+4W tGҿ Ta};x?&[$9 ؄XU-$ .GW[#MippU9ѧec} : 3]6`t١iVqw3]h"]?]M`a0їY#P3>MiEsNr2Gôrou֠QޮZ>ЬG|3I153\2NjXA5oQii/T;T_ݻ7*'&q WRof"?,r7wٻV]gxADm5,< ظlcC5>LY*sYN;t0Y5R]Mt9`6_eHO`b80Xqb)6ê}.69pP[+n$0Jgs4qaOViiRڞ3Ts ƴ2W$Q`U`GHwsERZnz K;Ȣ- .LK T+i)GVmϵKUTdl|v rѢ√|KFݝۨ $w>N7wo=M RWj)AmۘH}+JyH1I:MѪk g*șgjҥl숁G]\G3,rmIJXzXv)42N9q m PfZu"+ fZ$Cٱu&ΚzֿP.%ug/Bqh|W)(PIZY)dm,n^UK~)jװJXT*oK Rԁ&@c쵆l`s"XVgP+\U?Iw2vK~I)ǘђ PV 
KrNQ+dp*9BVCy2lI%VpVR+A9pRbHP[ H ^yDd+ix@|MMOu|D#G-"Ck5\c"h5Aމdrd.' XԄtmUcƚ0llcЯ1i|9H>[fzKmDDIH&{J6FsYMe}[Q.wq* S DD_mw 6J٤QfrEƜu,0#˜FL%)/E@EJn:v;˧ݳ8CGn_W+o[UM7fγq,9Zdw/Yi7rqh?0v69vk΁%p :!<()HSFk4sCd䄡EIsb91w%9²4X/BJD"=^$ \g@! FNtLǃwPԭ҃o43/J3;χܞ7>|ڞ<5ɹ(6 ˯GCm\wwd<4NJѸ,N{҇WB&ػcyAl΂y3*KLB6^DL#DbRRx)fև`vHN*g=OGn̽*: WK8JCY#ocb{mg4ן;0sQ&Q\ yvGDZ]V8vJf0b*^^b]OgtyX{:Eh؜c'`"Trn Ye.ˉ*D ':/Ѧ%wTһ\\|mɎ0^FA€ʐ'ݣA @-LF :>9fa#i'o'eZX^u(ku ~K2ka<3ΓϿsoǿT[)/5+QFкb) K4$&%IoR1Ȓ"9WDy0M{z+呠@5U᷀b2ɿB`M'~%\wF/CwŘLSWERՋM=/߹߶G~^e~ɔ M0 CBH6lS(I2S&I0>n,t'ɕ0 =7UFO?Vi}[>*D {sNh:]Zw>f7'EҲx\7&iG4ʲJ\g}ZY8` z3cG=O13Se!:J֓/LϝY. |}|qyz=~>~MҲ7Xb>]L @>k=dЎ%MO%$-j:Y:F+Oh#r_ fުЩ*{ꄜ6w2Ip6 :>qm6'с*hHr`]ؙ8kRr4ʫgVPr(+gRqfj6ۺ>xUOjJoHQ5o<(yCI; zwA\$YD &(c9+)n_ՇE91^_a)魖EiOpHQ)fc+i몞:/6D)&4(4Kf9$e ?`@ w S8HC ڝO#>rJZ x u]ɂIɮ7LO&kٯO-mv/.u?kڜq+(ׇha(! Z?ԩP`.Z)0:IhRu,1 G+|a9`t̖ ӣN $Xo\tsPe6XrNAqcp\Ea)9S,G5O N-_7e/՗:U)Nju˛cl1kkܹEÓ56̮>vv}v޾u'!mm r#m޽ԗ]S c!o}yl]cu=o{Nm6{_ͣt-klYun7 o?y{,E+-7 Xzϼ͚Nvo o6}^ZÔ `.M}S&u1V_U|Jm%Um_6!Pmd1K4OO]$SZmUɾvhZ0cR#5ٕɮ?&^X7AҥZ""cvbϱawf@JyMrR7kܥEXA t;ᘓ:[zp9X͂9Sf֔ĴM"@F (.s椹a}A;E0kgJw\ؗ ]\hG.<(ys}C[@Ͼb7gll/"%3Qgc5$O@4Z\y'e;dDVAHA)5)SAdPk2HYWr\X4D]ytVL̾vgcOV[nY{`׬xuV DRr59d!m#Ae$Vq1,y5CЊ2䚔Y" 5*QxH,Q:_gpƨ_T!b #v>eD1#GFX*xwJA3#;pI%@q5D )Cm*g9c(3(ɩ(QgB ѪFf#ZB W1ut38[S;8yq\Yggd_^yqō9}\b v uG@ (ALTfR+p`HaŃb_ձ/ʎPLJݻnfW7MNp y?ԹdŗyY3wG.yVZڇq`ι >NGh{hb!?PC<+BTSd0V΅re'dQyJd.5:j+?uMO{z,}s&mABdRG 9(ƢLB8G\/ aZ2a$ Pumd [?rVJ <sE1sp ٤[gJn(|LI\X)+w)S ^nSfc5;41LB }b#ϼ]LSq|<"~zJ=[lQvjqL> 7N<(b QEU 3):SF$W| :H<,Z[c76D*)C @ hJUXFyH]#= /z(':H VkC2gU4; .(I}/FaZm.RRԵewHp=Ȏ.Hm fi<䂐;$#FsT-+E.8HҸűri;4kZgt7aioWSkJil5  E (~}J@Ӫ\Gi˨e5}<59eAI(Ϊ{5P2I5ܗRM߿P b pz]'}nmOrm.:CaH?Hۆ紟?Vsxqߦp>j"1m:(վf^?^Pٴ:$> +ogv|{1#π}̦yK-JT^VO֗{9]ua­Vnw^GRK;Wk]T45O+U : B_ϥu_w]$*t諾~6)Shۛ qAT"W>!=oj |5y Ao*' [Glr+':LM:Wr{g%zz!xFI(R*fUMl]I{仇_Ƈr:mqҕtzKVى8\r޹swjnrO .WRa`fBkBnHO,ۺ ~:زvspE* ΰMNxi͗IL el.v^;fv;'jo7o<)`3<,YkXf)碼IoTF9G[P,p?x_?jOG:xךּY Y'mPAsd<Ř KҌ!L/ nbMz&nsՀCJL4~]^#vh佰 *Iʊx,J5j1PYc.;΅䂳!11} nazw7AKzv7_鄳9"yE D ̨ sȸb4`rck]7[o@S=Jmn[FelW:K۵ K 
?8J4q(j=d@xjݫ~?7/W0,ibg vsW8?Uh Rߎ,_\44 9YG߁4_c:^v^h9~Ȍ!c6k+o;?1尼^xὥH^ 7,uR4|]sH^;?ixm.`1Qܩ&o&9Tc5o +ET/ӛ[W«G嫪Uf{Oh7 B .wC+B+mPHWCW(K`@t-メ+PJw"Jtutetb͔ʴKng&'w+ocΟ 4_+' O/uu57M8h}*q-$h &CPhJ;MJ`#M34 [N=a8=4]VʗP*]t#];\XnՀ  iCWP;]J'F:BBZ]!`'`*;"(JJ)]`CWPjw"ƌtut$X7$BN`*3"\G:iq+ ]\9cЂ;]JF:BҊ9$"n8 µLQ;\F:0M5vsԎpC+B+YPEYOR~^/5DfYqb>UI~mJ10v٢ΫaLkvx<^GDw%M5jejjS-DM 7f#/?+2P+t|*5%%Ey֒5dMgXbX{0rBʂu*2W.DJf8ˇu¹dB8YJT;  aIv%Y1Ns&yg9gdj_lQYr)5ʅ¥,;M=l`yόhS}?Z |CW,۹":ͨJuiztJӿ'=2}zy 0ŗ~J~p_~h ^(=]sJE]pzq~p_|Z6NW#+')Z]Ml6CWK[B:]M.(]!](m8lH]MBWmCТtn&`Մ+)'ڔHW!iCt5v'\[+j1xtjN~3t׭Dk^]MAW+1M]M!n&BWD`ѕy`79ȉg..`G>{;%zW~ wF"ŧ1{^aqazُMs Y! ~t~`Ät6J&FJ(lgip̶d9^~;K ġW_~%t >pk^h{[%{Е(]=w1&nfڴpl&ZC2D#+g!`pBW@^it^;7DWp+hR+rmX ]Mn3t5st5QFV:B NDW] Uk$WW7ow?N .F՘dGlMlJ#Xf>$$gȶ$D` 'Wr!+ݛ [t{AnsΥ4Ww:7KM>0vV:{z`;ya5>LC@1wɦƌ`aBǐ]X.6Pom΄qh)5N({Ͻ#B)bS KduhFK<Qr9]+>%Or|{HX#ڊG @IΔB1#%IG֟KAh1`2cs"1G1o|nVRjQK&r8DE0S"^:mɶ1ڧ⌱YF#h# 7Z^ [ŬG]`bm$f3x:p2M6W%D!jmH fU cJ7 60/Z,J肸PV24ITiy],C833]ZK1ZzOG"LP83 *6{/PΎG⭐!B@8r[WUSYߙ yL'mG5j%e v"}KƏǧ^żM,Z V\Kе2BXF8;K˹;j"a7e)B-Td`(#(~a6(Xsv, mM3*Z.fcvl ABx*hVՌ!#XD?Gop JI"sQ:7s2XdY$ fZ&k %֔!P?Amj D,QDTDkzs`Qy8YU;L6( v8+dA A1+'jjjZ2Z ? vڳ>5M6Tf5OmG@i*h0/5{%`?BZ&B[퐄d$G}>͋3KP03/ [ Ce,c|O^yM/^esLrOi ]< hf=_{bMhSQo- Φ4QGCG,ɺQ3lFfh4v˨ lXEF,FHoH#Ԭ5n(!/{ÇSi樇*tyF0C{ߊ"Y0}E;8B l R KXq<"؀oݬ708Vhn"8$huSj z\,[RQ0bP. s,1@z*YJ#&FXzuk[s mJS\mqL/19==G+fn1ȀkլU5m(*yͤ &S@ ZRbD%]-kP2p\>@[h! >ѫ8AeӪ5-o(-Bi䋑Q+ׅiq#0]Q%Z{:o(=PCXm(utqH:ێ4JäeDB\5fs˥zpU[.4!b!X f=|TAH 8eBN V3f !Kk}Vi^]w$2( n^ryysn;];-ySrߨSJ6}^wCUTm=mNI*p1-i|O?ݽ_ܛS`wrY+hM5_\Onߜ|zu?y/i>/op77^_g s~b-m_fcݎqqweH:F'PI@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': NuMGܒhnC8'?lLQ@(ΪH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': tN {Syu;N '&>x'TZ:F'P^: N uH@R': N uH@R': N uH@R': N uH@R': N uH@R': N uH@J"o m f@@;2gZ{8_!7#A Gnd? 
5E*$%o!փF"[fU5Uӵ+bLJZ T+j%P@VJZ T+j%P@VJZ T+j%P@VJZ T+j%P@VJZ T+j%P@VJZ T+j%P@S Ij!C64upSMmz}N_%zRe L&` %k,p {d.}KM=dTg-n9'!ZkNW[JPIUI֕r+Ygt(Uc+I1[]!`e+kH)thmA@ivJWCW-TAt*եЕBϕ%4ㆱ rAk ZKd Q #]VCWB/t(tute7Vo T.կx]R˯@goLeGU k`9|H ~ۯb4harVߑ eC!,ʪo?D8iWѸ(i}0w۵w9Ť ,jOAAI7L̻t ~srw #f2/Fa~,üHN:U,f|~ ">S@qZ `Q\^͛Vy^Ƕ뇚0},<= :#o7NJt)'tI=|=]^o,5!Vr\w %oNlԒk'Z&ÜL_<݊Y^\awt"-f0Mt`NN}*vOʍʜ.ITFb/ŤaNYl:/BF6bɖ4sgUc] fB?kf-Y3 nی&6IS !L9Y?}C - `Si sZ% - -03rJ8i1{kJW{eR #ղmL=B kj+t)th;]!J)*]!]1-aBJBWȾDY]`X1tpy1Ut(uJyx1tp-}+DT#+)9|Iמ,et(y #])|oI+D9 •XWV޺BZW:B-$.彏]!J!+]!]?DWXhR ]\M)th;]!JK*]#] TJ;$g{_V#|0R M#Zd5ˡiS(t`{h .'*CoJ޳݃hN=UEWRY ]!CgVԦS'~pi1th;]!ʭ̕&&<ƒRkJ^:|ԨJWGHWB fmAtǼB h?ZWHWRkTAt-'p%t(&! +N-4}+DjJ..B>eOz~(tute&y]¥B+DU.[2.OJ0_ɵW%ܗͿG[!g/䠬=eLNٯۧD:ǝl0ro'u;lI[u{x~ӹ}Bh~1VPhk^LhbB GB%5ń؎SN>`},?g J:%+]JWOz ff6B>`?JfQf++~:8~ ӋQJnSZLgt|ø*'>>ߠGC|I<]Gj#ݯOw[.lǀ ΄t;oW8MZ >e`7bx3.-$FZVNXQ- Nh+oֶd, |2u=%f*=RxpKbzR9比pWt :? 8u:~[+rg{^m-j<\._ >~~.ɷD>Z}ObZQ2Z#4(RF8)b}t(tut%$$Bb JQLоa:ϧ+@uutQDWXotEWWR jwBFT:BҔj +|Uްb wBVU:B2TrV3m9++}~hi"%i\GIW-gwSV )?%c+hݽߘghtÁb:سeu}˿~UHHtpׯ {~r5/g4ɖFTh 1ls)Fҿz~.0~ڜm?Vj77q[yطU3_/@7_5iw6\?ƫ!w8~>^_M+;|D.z?G罥B$2Lye^im0[=0 $ÉRVl=(rՂ^!kv ۯ^M&|,QK,6j[.|# |r:Z`C4~4qM^n/;O׳vw1e70_l0ī1ѫ]xwE+F!]]l(@\YZQ[3?Ȱh@غY~X< dME]xV le>4Hie䭖l:}@2&11/q՞n6i|m};dןxm05o=77x5Y4ҋL,:19C6?NNӆ1BҀ>YN77D>وE ?r#C?J(h1oKNQ%0]K?=[CZ 3=,m䮠 ~eN x?S=dt(?N.Kz@ VF?@+({QJأ ZvEB..Í[{v9>捗]F6܈\c b{'!qF[QDdLG*T9w8<$˨XDSYɤ̔dok=ݣjaxՍ_l9'.Th1Km ML%ODf#) 9M Ӆі=IQYcA;=X[%x)no|<.ʿ~\MBXΘ6kإYp4$gkk˳to7x2pB̨1f .X3 'i=.Ep:EEwJ2TǙ;[;.J3ʅc.4 _ o=|^\e|v7?b< 7OߍG}f 爓A%m.a=O kǬsੴRyE4˖[gI ,"JARe|@D)#ZC QZof)Z<F$<fBsx:PnjyuebiSyűbŇ4޺D)4DJ;L;ʇc>OQx loVCIp5Y@B b:~^OJ/vs~no;Hzfj%R``M$]IGZVmzWVJ7#1x7Z&Z ?Φqzξo l[]tlfJ!pǕ'/[N9 6FsS yTJpBf2X2 2eV1'Xr'j"Q2{ 1s]h.gMOO.XޯA/vX^s}ƽҗMK#wm|q$ 3NT`+hJF' >R0RRdXF)<0.Ql?ktq9_bO;j)HS@s$DЄZJ<.Kũє;[7YM{GC(ڳǸ;Ȏ.Hѧ"%_8{g ѩ ʺ8ŭxo2ΞݚƁ{;Ogp]Y/#R M.^@|fhٺi2vj~Gdꔸpؤ]3bc15coLC9 /54\HAH[ۛ}eur=Fm-f̶0+5o_2kf7;[kkBTsR,'|熱UW%7O~Zv~zEC_.g/GYT@cp2mx8W+o|N0V!(oܗ(./w䉼`TeYh &Zn 7([>$CMO 7w FEREM(\G*6_ɀ&ﭱJ7=~<R: 
Vo>\upwlK$#d01Skg{BcB*;2_{.&4Xo77KqI ڪ0Ggv|1kiM&M9l"Ֆ柸7t.Y1s'P]=05'͊gYq$ ˴5F ^K"@h3h r hC7s:Ê5U!Gtg x;.|TT1EJGC69mhb)B LAI;3 L!-+_̡6{&nq˪g|@''ij̐:ySЊP%KDBz^zpي S9-Ɂ~9S;؞N#ԗ= lrg9Ef%@AFCdDLTbR8ek38xH0ue ~<]@k[f $Nh +2 Fb4 D ҵSlWa5^Ng˸O?#{YNfg;uۮOdj)v! Uxɵ%J{Y}DHY`8ln%Ǯ۩/N\vN&V.TJwkl.MYh^I 'fAFDx$X\ Q3 UcE#DF;Fug] :6+nܓXNVw6NDVm /wN ל+bMRg' "BVJU$R:|ҠȅUwϹI.zdj%ϝugKmspAaA! pcs'>EQ~&(>ݕ&6Bxaa!Q9ITF^&@ϼa:;#LKC$KAtIFN:^WaӫQyxL[ ({U@uQKI o{X2h/* =M dZ>k mh+{78zwDv߹]fƵT]8&PXbPG߂A<QJOuI/z]xo{sq98X|8Quj="TGEqmJ 7Ukѻè%8}ݙdw2o7~o91N:S^}zZE[Ave#.i#Bv?rtQEKCBp\b>>ovֿn ~8۬;fk&Yя\7R({uI NՉ~í4=$=lwmYk_tt 'q2eIsU 'xј` &HŃK}lW10i-y B,n 'BktvDZ(ց_:w_~w&FG(̺ GRDD|Xo-nDž&X|gx@l=? F7ZcPI)L^I\ъģ0FNh9N3RKƀK%4yB[+%+b%#qT@hƴDi..@P30) '?a2V46nm%?>~O3IWoM[}5y'EDON9hap17|lPGxQG 9xS*ŋEFL5h d aPN9z甇#7¹^Mg(Ϗ`h_h,G>#U^@BݍW8pZ9GoO3 hVe b1;#PUq*1R=Qy1^%NNgǫJc@:; < ο~k`It9 qz LN0* 6~ȶVJBE9%ϷIȟFPS1=Ÿv6GW(GĿ F" .ΐOyo_/N)e?^~j PbA'+}z iy?OY0|f-T bD+TM<θJ3gI+UMFTĐ_:9?+7}}p*x8Oa?=Rez9uGߟarCR3䋪gpYnUT@YY.ug*__68_6/T/d PxTu:&8\j}xĆR0J9+],xgW=+\z@WA.6ڨ#d2EҞEDY,a}NLBA~"2O!KjD=2:pBFD;㔕@Tp ڿʹ"fۥބy"j-I& *A(7AP!T 2P$\:@VYVTdYٿ[#(;Ϙ1(SFڄPbdGaa4r# 9pNVFJxRƠb#qPy8;O$g;XEۨii]]o\7+Fv=$RWmbbm/ }&nLK͌qs&>sPG"""9 U ԨāZ+WؑU bɸޞ l^2ѳ(_%wMHuٙ(k6g"{#މKbrIS艝 ]ԣz\!9BY;K ~],;sk!7$ [-H)d?&Z-n0Xk{X ցр AC7S.,\,,l,ckuTRoPZ\$q#}Rl" parҲx,ócFrѥ =r㢻3g{w?]6gȟ?+|W7  gLޕ czM\Ij-3wcys_\A@X 2^Uɗ|r$bgMTuf* ђĉc̾crF_:ŌdXsh8LJ|-Ԁ:ތ/'6|>J;kcS@t\vu~vf_o-}bO>^cd_u z]R(K欹PVLuR.ʻr6j4dž[r4@72w)p Gْ ƺٻY/}TVF V!ʏuQ MvjDU@#!_8 FCZ:\0IZWZ{ł(Jj6`kETqDI߅^,pB -PWN9 -k~ΧnEdYa >&' l1+KZԭWKW1;S `}  уvX'[VdF SdL6xS4uA\m 9Һ(F~>7zSv7Ǘ+!>`uu}ݕ>/Y @>+ao2dr#E s9I|$#YvrdSW NmZ;\`6 \ciDP-ABQTBn|]Y߀_k_x>£ǯ<=y8ꟜпʹxB'Y6K8ZpT-Еʼn|Oo[om)ˋ }5VhCxTq3DPcrGڭ%ҷ^Zƨ !MQb,{kfw%x/q^[1on!V 6FH5gt!1֖UkIY02Ce#N0*)(KRʹHUDqD&q8U ?OӤ,cROkIajG)fVLXUq:El6^MIP0)S(-o_L;3*͸Y\qbT8eEJA59Q<%B Bt*O k͍qE|,<@(rg~s8D}@EM`oȻKP^-RZe"X X'E1WM\Ob7W`X93h2WF1W"6x(J/}J&s͕1W"դ])am2W/\9`sP":vs=tդ;MJ7Ds+ᠫ&9UaIiPK4W$o+3ĵs2ؤǮDJd^bֲ/ˌݬ[-aՠkg 3S[ o~.[/ 5Vk_~{q/4?>Y 
T-,xuNM?ײ@Y,db>$8;=͏;/ðJZ"&'Nr<,?2`x-4DӵWDVg#D@-+AiOތߪJWP+;~ZԞ+PM1{C"26.˔E}&W>;;)Wo`gѰlD -nܴzvްKCU  T9ԜZffn׳lkA^Kc0),vʚD_hY^RfyY~pF q}GB'.GI,I#;gG+նS%\&b'-\ivzX2W"0ᠫ&C1WMZnl&sCe?=9{צ,+WGb֖\\M~8+6^ٛۡ_ ɒ*2 Υ!"0j< W;k]9?NhG_5"XD[!:ZU։Lmd)SP io1 =b^ on1bAgg/aiw6XN^١}%Fw~^km0n'e5k=K2ED^}xvY=KN㻵9yH}gڇD槶&OCkFJ^w{S_uV7qxOܭJG-k3D^2V,]|8ۅQݠ,4?ȅ;"+\ KV &N +C2zcg]оPܲ W6ؤb `Wƒ-Br9r {-Ɂ/瑯|}j_cfo N}#i84m;iMG4mRt$"^*q6ܩ,VmRvuCX9g?w7nD Uno ZL*h cNNa-x%gbFrdJ TM_w}r=.bB+éQ( EV kňy(Rl蘚..Md9"ZÄz-ý!*)>bS#pޗDњں")ʊQ6s.RNOMs۱oF'={"gUP*i1U&sPv9*++k[ceUςXiYEhWa#ev)j U*œs 8Գ^ U= U.oKѴn/e,>`іa K1F{I|7uT \~+e`X) **[Juv$MpI8:+]QkM&yZM[*bAjT J-+*LPd\oOGDDOEdgcQnKd[峝;ԣfgbL^kgx'.ɁTc@!zb'H&`lHgE;mufmcMM<>Ar˰eXrH|)7nOq'\M|rgon0X|kk:p d7Ct( f#kI[%lE;<*7۸!$/ޠ8HF*&E&e']]1 :)XqQ: xܭSdNQ0Urm>"Ю8O2H#TF-vEWg.NY/P3>wEvZ|֋6n::B}Yot (!J%*|޶ kr~`K^gHPCoJr9RNb`] r\Dk+dBVY`&񕏛Q7E]Ƃ"I^XXZpCrz4ڞY\QGG7[^zlr醰T6#ſU~+?7C{ҍyؐ|˟[1._gKz-V\*_x׶OPoOPO[gŽm͆p"\ xrjwuۀ``C43"vu__٢_c[Ŕ&\%1V֭G5v~ؐCxXMF%5Q&F/w^3_zSkCwLV~8҂ffE=''c8,wdzşosL5qك/)ǁNC0C@A |+$Ad}`;d!=Ŗd[K5-{zlzY/lzMY:^ህ%HuLE+q} 8aX+VI XAJh3P"&/O u.M{х1"Z1!s*y l¢h6R%.Q񶱂4 吔y IPEJRLt D2+YH *꜔ CYyzY&ayB9&Dr y:l$![fd"[HkBo[&x^=cDm rh^~mڞ6ic@͇n{|H r ;HYp<Ē>o,iR%,ĢE$c<@^(bnN} ~SN#)ƔE$@4^fra@+ A"H:`rZ`d隉0aLT5Đ_kv-3W| Bm㳧>Y_mMKuCkD6yQxr2 <{,2M.'LO"eP&ʻ`o${6վ&+&l `T+,v%(~;7dt#Br,Jp䪔Z~Brzuat>;ӅS6tq)#Ǯd¨Mx}i.W4P<.u-/])$'t٠Dd-;4ѝM;>$;0# yͦ]]rP]loo7q_%ELNg!#Iqz lw>}m@?UN#vADi6Ȥ/hOV4Xd6 MT!|( ɜp)\4F(Z}4(&-=زuE**Yq?‡P6'sB!v/|}t19Zk/pn}# 7.\Ύmn<%z3QDKB&CVhy썵)ykh %P &jpD <w&X(:Qu7 ZXJ4a g=dzWދկz`=)zr]1XdY2ĉIWݭ"\8 < *ŏg嬁mitԩ"LgtGQC=ٖ ۶4l˝FlON-sQZsDGyj`~U@DT!4"f)lWn[ݶpۗ'rku/ teiT3Ţ@S˭ BIc,QFI8C:SyWmkuz9?(mO׹1qu]n?k-Ue2خcp~$6XњYC::tTQ^Lܙ93C3$*mHx6jSL=ҹd)aD,:6gMeذ ODk#dYLK{2pƂJBNJ>i%,uN/Ƨ'P3ʘ< hvZllYlMfKeU*IUgk95!%[|CVvI ڀBES@suHPV2~JS$Ϭ*(eh Mf(_61!rz6ZuUּRLru3$('w%*P IW3<@%zuIte}MկlN(ȺL E6QE%ёl2:v@:v͐KnnUw7l ہϴj[ZtޟuRx㇎g<>'^Z, k%NkM h;6KWBIQWU9~u"]?7|PC>K A{/eGY "!sPNmB RV< 5Ԑ_3^A!g ZhW24}L뙘vƵ uvyʷt "x SM]2k50Mƕ2z|~)֞jC~a>"rl^Hz2)U@4jc0նd9l(/)bo.2?;l 0\lV/nIc}~>|3;? 
yz0.Qt-: k*;YUӇ(~5e׷ti.m-FѦVu|]jzm=?bVڬZuu7o{^~zBG[vܲ[4| aVƨoyw;o3߶;At|?aKwz!\鶇k4f[mΎOb`}F[{En3Ҝ˅~燞v ܧ9s,)9tvWgL-9A7 9_,@)h( PQ3*i UD!+k#P(Z-5}H;_w.ZwBpC!25xHgIay:TlF8T-IP0f(^ o)ĕXIҠ7$rNXQL7aZ'wLpyc6_2mn/*kJST"G2R #/ּJd*V5K@uxI'  cBSdޕ xhzrk$J!'_ .'+R A526gxf\b!6B?`^p4gvY9[n@OL"?oЅӓ#@x%JF-b"@y )Da6BP͔n"PjTHiUaSJȚ59 -<eTb.":ߺO+qFl[&桠vٱ#j =053^M5'Ou :Nv;eB{C`%c+`]CfҰuv5)hUV>Z&Q|+Dd::|3qި_&x(l~슈1"DX!,YTAE/33"4&`$Z 2HmsLBXg6εZq")urF3(zmcdKZ&L4aDtSttH{fɮpqō%BRbEP%,JTL0J4^pPuc<Ի {8?d7GX7MNp3y?Z@|1sy7z?9'~|U]T_>yF7𕿆/  _|6|.eep9#p|ۿҘJ ].)7yzKff_Qսt`EkП{i_ '@ЇJC)}hx|a5a2Pd"ᕪdže{4e~Čk?P˻93tv>/C/.'<Ivvijqgb2b\,W/)8G[jY *q?h"vѹ 9wʬ}xIcZvXul(:h[b`$%/QK- ^(l`nr E鐊, ԏ\jbl9qEƃO!=ݶgZ)YRVZ$ɖ:)#"JJJ#_V{A:x`sjG䔓( ;V1V)ؘ+6\!&F?nsJtH ^M-vnp?}>1XgB"֘XT p)+J BFɒ!K xƓqZ!)85haB/LY)FDւ)"fF9yZ pZ;89^{Ac-DzIQZB!` A;ʹ mʺJ-qd!EZ[v'| و/Fv7IB_9SHHJ FDZ13#dkb+YPL Z;0'mcp֟rL{)kY5&_On*5ˣoѲ^bbT8돑wH(8(Ͳp/UU^8kszǣ;UY3z{OLу>7e(t4vY9<{k'ScF Rn"YF({ ` kf2ڵ=x{|HGx6Yvo!hc{ٻFnd* 6mx`7;L &-,y%y&ރ}S,VK`fl5EU*c: 9Z8Γ” fyp?I}C\P<<0u^)J[N}54ns?jQƂjW,l@x^/rJM5D*lQpI^-E,SgER[UꍆĆ o&΢S*(N+G//]2>& .7*MH!Nx\]0}P^#O?AR 3EuIҖ w[\Pg` LbTTaXa{;Ĵfy9l"͎]0tfP&Rͦvۏ菱2U;rZp=a,QA%(KfT"ݸj5>6F.gNϝV;J $J m%k/ݢir+s+NLcufnak~[lv7;=ktnFBC6hHI3 \b ?5 ?TS Bh/[db|9x޶8MŇXTH,y4UU@ny҆ o%"Z(nvcP9@$bצ//Pj!dHģѤ5N[;Ap\@a.DB/ӛSmM; I%:9'D3X0<:d!+XШYF ]AHf8.) 
ڂ1DQBr)\B8>~~R|g9{>w +DrΎq?qf[F ta1I*-fNsh^n;^bfW[Loc)o|m -f2jss6a\Aڜrø=z; hz 8Ҭ.T/ Z h)}>KqT]ϻG+m(~;$d՛^}n~?ԗL:v 7ɛ&&蟷Kj7zJa׷-CbjPlVګT/RI*U'`Jer5JL1]/UT@_S\)%HbqpUP ]L%^!\iWOW\s0pjʺWJzzpe%XNv&iqy1gr sRY1~+6~aׂN+FOo(&Y4j%L&/G)crmP,Lg<*~(0՝73+PӇbW/, }i#qԾQTCwJ<DWz6:  \er5=BjЮUk>> \QR+$XQu0pp(p׮2p qWquVqN-aEg9Pewu~ՙժ3g.e2$, Z"D3Jyr=yqhy_od.tMAk*,>C8l"/\Kj[~k;#"@ b4RKsց BpVFZ)  RJh?@z=5yC'W㼤fgs52٭\Ց\{Ҳ_ĈOV/6F)Eki01ǁs8@ ,YìN*Ge˗Z#:5:at2Gz -(@aB[/g )kkF !oBȏnNwUsBi%ήdMfZ^tTֽ&qhys4UzY|bQ Wj_rXɩ4N: b"B55`R NF΁r x фe`h0ru Lsܜw!Si;繻U؁-'?i(疘aR }(S?9HmyjTE]DTF% %t,O;3.B|4BO<6a2X0%/AE(E6IS'-jFчyZA[?|z4C:8~a8x&_NF0 SR$q(I="KꞞ"RwI*NRߙ/зUh".CCuNknoqNvPYn>O4$.Zx@'ڈuʨ7LJ,MV^wɤ;wNY Jb eҔߏQ4&9'E~Sf:\z8_$@b8TʷCU; ˛ɑxdxdQ9b8+w%3K1LY< O~#JچRʓY*n͹i:AN:C,܏=fu')a4>7sya=5od4ŭ%zYF|K%"r69;= OѲu|%pCfꐸp.Sbci5Ϙ0}Cސ5s4 <}eur"κEEYݝΕ0h]q,'+{ӷL7x[5:ȟyV;Zhk? z1&Xs4P(J)rDP_BhUPHBWwϹI.zj%kFκeX&%e$Ef^ p pcQr'>EQ&( >Fk`_x`!V5IJMrR ytFH 3:9,蒌T;o_YQN/G}1mY0Gԣ7Uā4ۢD-* (d^Uz#@=}A7V n' J <x}DqI9pV|2/B9,kb[v`@fLv"-څ֘.T]&Ld( Dh,@{Ђo[ĠDɨh:sVwWҋ|\K]m_Oa#xe7b86 OaTZ&#ZUAQ\҂xuoѻ%8}ݞn=_۫ΐ~8tt ji=RL(0 a:W(M!!JTe 1o YY;߃. ?ˬ;Xfk&Yя\7R({uI NՉ~å6ݧ#lm,W.)@N"eD 62)O7ǔW{uhq#0C \ܻ l߁5˗[9rף`Z)eXx/i 98Fys ?sW>T [>_ƀ|]5͛FlӴiEt-~w]4{ln8 1m^.H;\^t}ݿ]wÖa_r> U } l60Qt3de2XEkGz`\}g(dR{{oKIK EtIJp ;@G.it:Y/f®Ď̄]0}#ezҋL髰Oy'Hn)^A: 21`$VKQc`p=g+}zH gCߚ }{Nqҟ[LiRYz0Hur`*@h TRbK/-szyTp͉췞uJg9PBgeNj Qp6()N*ɭ5nvݜqjHnI@vF֧Awmf>_^9/HUSA=mؒmz6i&a< @UQvl7~|Qi˗|yo.g}:lHigTgӋ ܫGߟ^E{R\\T=Scr+h%*J*;u; +Fv/jP05ht2s$*lmxA^ Ws"y5{?Շ_VƵ~ԜRj~5r.~1Mk{ ~)/EBk_OBXԵ=<(p6/ O9 ϝ?=jV!3g8NK7D5R G74jDdR8K<+IŸ^_TKCڧ{7^ UĄ1ԬԂbLGZ:I]^"$;̈́ws">^zIPDRBHZcLH{34RuLZhrzyśҵ0ո?ǻy{[o{Oi<5I-'sS)Ų;\V&8Ӈ1j}i `4sVč3NyWYp+ # 6Ĉ74YCtpg'z{&|=ϴ"fߥWޅy"FHI. ArKhU*W(UldDT+гڻ"j,%?k)bQjY"2&#s> Q gTu*'i \E29X[B,cj/Ne?"*}9ѝ}m]p|Isg&><\l! ѷ5JkKO&7FA$8#k66Iy0o%Q%w{-;ßrv6R wLj<ur)PtS™B Rm S9s$9g3 KY@XF 6JyYޱu֝-lH6LTyDE:gFX潄"e.00REF}7:"Х|\O=-H Q(!@m+%&46= jٰz? 
߇?]]o\9r+kEY%``'$>$/%mK|,Sl[d+{HŪ\`,\Jyg4X$ՆhĀ ff:dcՖg8&s4x%y8V65lH%,h tpРU RN\v&bnlKm="ep.'LTh"Ň#/uBGbL^\3gArLm?4P;ti^-26ӳ_:FǮ7pr.n8EdX6VM^Cn"P$&; 6A&OmOi@p1FkkhV AD`9|)Y.Ke0/¼X c:%RIxb\"(,(. )'-;U{;u e wRHeuެJXtd_s>*eک&埐uQB8(|k&oz*iB*i*W罨~;#KjmJkX4$ʮR`-F+UL`xZ/y]h^gs+NwSUL`|!}-fet]U>m1;( }SA v=Ģ7IgsV?~2jEX媋el5*OIQUa;h!V$㪟ϖnpHף); ķ^vqpU f\Pɷd Fb'B Yr&H6tf͕s$Lk1-ta(peH@"19dQդƎ +V=E+b|^kvbiQ~Ǯ-XNimtGEd`Gmp"zنR\Dv:mdA[ +_]odk9i|t~SC;QKcu4|Cz$H )礌ʞ=go,&DPe%"Qt\J& z/e*e#<(F&Ec\ًUc =Vk6RZ#|*cH՜A ĶT"TZWǜIL4M7#NCbJ.%RʹHU )%L>Jn2mr_OҼXX<9b^,=V< FUaY=;ԪJB2% byŲ{,ü࢙vV:Ҹz]mTVCdTZlFy(Y%ZSyVFQZqALaytq1<@ Rw6C;F{f>G9(99WVn\M#4S?Gh(k#nujhitP2tt6v pӠX(Q?nhKM򪇢^M>x͡2促?Ϗ?,N/fMQ38hjXQvDа7O79-_.UKq?8'9<&$Rdb~>8=)w&GÉG|FHLi/ KfHoo3w,}RHYkW9Ϯ^@"'aM+AOKkK?_KeꕲQDg=W5c`CCjN`򦎿+Ybeٶ&;\,ΎS$%I\)"*AxG :h*VmLI ng&Vz9{)ơBh$A`>i yǘGhYZ,aqmlRYe'JAKɠe瑱䙮XYѬzU&(st[A}u~[k\O>;U?9ۡ$7io%ͻ7 4Lu;mvWyMC[ЕS숮`VЕ5{MJ3]!]Z7$F;Ǣ+ VBW*+;bE Yk;l7t녮-kW %>ҕud:+l5 ]5NW 3]!]C]5;O\ߍ瑱e7uVLW{HWΑJ{d}noxt>'tP{]5U뺡v %LW{HW-MVjpz骡YKb zUQ׆f2%ꁳ~ S/?_V8+ySeIF=kvwle]+f@h˚;onoO=7q-4܍'T:=sa1ǵ@zz^6#t`]lZ]mG[tةsGt%U\/t2Nv=+ +h͹jpw]`;S>ҕaC;+wCW UCk&o 6'3]]Y؎&솮\Еz`C 8ҕS'pa~Zy ]5ԍ]UC1t啵:+\lhwuk+K3]!]#4"]5.вZ;ywtJͺ,ß_A{swcT(#cvN~6h)]Y-Wv.m!?}6́džn ÿ-\{j[?]|O7Ap,e@Lksp懃[~s.UÃoPOҵwkjVMx+Wo[m]mdd97COS\w"}Y@kRsmپ}-p~qaY7ΰsuˮʢSytopůw<d2S+wM7Kz-]j࣬5q\ \(~V+¸eW]bC^?K*[0 eYB$(?07wG p#8>bln)̯@1!>KG79D߿ @΅ ei:.[5¿ 0r `R R-ڑu]d댊EGtrX[D= o叿^ gwhѹlo%  otC&t1W68A'XW nU'ltuPhFeuV6Qϧ=C j*Fe^ꐌQ!+}1CYUv!=2wl`8VR֗5gx; G$}VD ( DK ),dbN2-}զfj lct> -Va[AN?9.Y1ڢӔ*DѤZEEP̲&E{tPLcpZYt̊ɱ jQ4:mIb]{ h-l~]]ً@Y{oE`D t(c2U0JVC|b*2ѲЄ\+`>;LJăHiD  3JhY}$iO8rˊiWc5>#() ElD 1r 9 VClZ"5ȲW"ZJ( 6A j"b C$CuH*мGwU+czh2&236I+\CmŌUD1c**&6/ I; ')Q"_ 0bÀmrq{}Ӯ֖);'S4~kW F=Fu&>Hh"X # o b9PT8xi?olJ?n:@CVLAG=$]I4T*#d`&SQd~Z7G:#.h{ 4I"i Qk2!`(`Ghx L}h$kIu 5<o 7z ,:T% 9ՠhKP-"V1hFy¶e@TD~*Bwy4SU)VXB(;KSv5H!/Z@mD%|uj З9ttIճD(ʠvPJ5K q[O κ$!H;V@@]BZx _3>D[&$˽e;VE̓A;J"!ͱk,B$ fj2RAٕ P?AjDPqkUGP2,,HcF8&gو.BPˉ6BkB#`%͐eќ4MUJPD[vTjj5]ZVE*,e4G-w$a:J` ئQ}AoAKCj)M.ڹy6W]ϧ 
X2]L۴˶\GI{l`0CdCwH7ۭIFOB=V`&cӿB("͢qնYSQt5(U/ yHY=yh4vM Ƥ=EB #my9T2j7 JȀ-ɡmM1WdsC<܈ܢYf1uA;(XfhTYڂDeƮXBD4cҠNn )DyE aLw(ƈMnƢb$ӽw:Hc[qOҕȪT~Qc@sjo6i]0r0݁ b %_T4/Q4T Z،P-EC9yeǝ v)D7lajlJ!@6(-VT@9 Z6i/zBL M F:`G򨵧BSp*qKZkCWQ[1ixxpQi!XwԘM5Uti J.ȎYxЬ ބ$CtO](H_PFU~!(L ΃rt `j?-_^ZܼZl-ǥ\)'~2M%ht7a{?~* LdBSaL]2z76X|yvU)W 3 巫%o ~ym77϶}?,ܟcju}3yw|Z*&G.#Dk:hYs)L7׻]\.~".wnί߷O׸˛z >:Zh@&e~>UaN `%y,Ua'5wʝe' 9 @b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; NuiHY~'[= & 7=I'u; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@ȡjO '8׹a@$ h;N ; N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@ #9؄q@k0N fN @'"! N v@b'; N v@b'; N v@b'; N v@b'; N v@b'; N v@:!'G{Y[U׷w V7eu{yt9=qG5q pv;37.Zn\";6.}ƥu= g+R CWWQ7w"N1] ]\" CWQz5w"0]"]Cޜ_Pkuy:߭h_o/jSq,X]"P_=x{x}?#r=꺜~?^} bx \|[Qn/ /᷻qBt}/ ))P'cr}tۥֽW HBѸ׫vሆCӦS`3]NVWV~MR3\QL'{{F s+Bi% ҕ3]pj"z"N̝e`:E9'| CW7Q h"\ $]wT`ǡ+뇡+bsWRY+'n "abF= ]ftE(NXqCf_ JN- i0+rSR)(a~K#ʞEm1ݾ\*T45f&Nϝ LӧHRN*o:kJ6ܾDě7ޛp`٠Vo/}/LX&K~ûtSW~ z&{\=_/]ɭM439Ub ͙δͱ^jg/6(@~~)?ᅲm_1}U]758//qtqQX D (gi1ݨtw!}lOSΧkee-U>X.ǐQ%r7$LP(!;z1Gsޓ|+PynSm?^wOLLton)\.e5x49br )dms)T.5j~ fj_˦}a˴eoy;tcj{H_!|K%܇"|]#ꮶHS=$%&)nɴ=0, fMOUTwWWF ;[oSF%ov6߽`W`yyϧ`d:\xWRwO`$6D:¢h l!lKp 8O5 w`,d,>X^|¬+$EAN訋fU3@ѩ)+<̙cM& Fg?ݵQ.bJʻ~V{|}ucF>$ ÖbP-`B\@dLjb1Љ<&Sj+|;vYW)lڽoUK3m \Xsx-^l _6fms}jpO?,?n}m/Q?~) ͰgkFdR)6 &@x9j qrزjq˦ףXz %%|jnv7)d\q{7+FWk |?:GZna2't]zʂ2dUM6<0\XIb !jx/6 yxݓhGD<6""B'EO/ }/ ie@P2ZgSѢ8Q3:@ՙL1R^ui~SL#)ڥ"5g *ʘ .YPPJIFBR1M3q4v`/՘|8gW8%na~2_n|WHq߲4|R [b-Իwȳb|K}IAP :F&<(WzIEii" 4z1H! L`Mw޾e~_V&glv؞JF{kEs~p݄..= &A.5whdIBym:6 kʼn#LN=+8pf(Ř셔^DFH!Ȩ-EpT⁎M1V*-$(֢/>b+WI$QB* k#6q{ -bY/uuK#u.!g_:+9 WӞ!~钧5'{K3bvS 3zA~sg ?k&OEI*(|H2ʂ2u!POe!-33[QԓL "d^"bcL#79Rw=;a;nv֥]#;Y^l _߾"A{! 
[Binary data omitted: gzip-compressed log payload (var/home/core/zuul-output/logs/kubelet.log.gz) extracted from a tar archive; the compressed bytes are not renderable as text.]
2) Ӣ52]c1nw'4ky@$,wcj\MXU54\2ۏc1ЪղGeƢM7;*F)?o6%>,klMRL_T1Iݥ|JUtr,`32ODأ$P;P3կ2G>fr6eO*D@ /<^p垝wyaYP EV~!0H ,t"))N"B.|R$ƚI7E;BPZɾ#*D,iШ,+"$0^= *DR|٩FYB=+nPkL%͟,ւ@aڞ,dZ4{{ZhdH2$@0FCs^_gOM(sT2fͩ"GUPk LS.~b૨yB9Q x+1>L-GF!-skxd|!p{2?Uׂ/nЋW(:"81kNn^HۭQo$P9pTJ-۽a)u\g-abLMpSo͚Ι5my~k 'n1ɫv/~ܾPT>z)vKSBUр{3Ȁ!\ =2pmƬ)!,zn6gM'M_+8ZDeH쌨[Hhlwj#5[1)ldymM7Y 3}ԦkAE8"Jb-NH<B MvufMՃ\V썫OS^CIyn&no:#h6mcTRizLԓN%L= ee[4I "J$98888Ȭ9Y -ϡ&z⢧ѫb:qv$庴?f"[cy7/pqд͊?vr1=ӆe7,gzB?`iӆ홶 !6)z{Cj#j#:M˨ݎ(nvkCB>sm$SB>vڭ)N2hcH![ڐ\D,SlتYy*?:m֍``\ÆRJzqIH8פ?JWJ$F^n(.HtGi.ElVh/ ?dTkeu%4&Zp_+f,.)~x}XdKř"2һdkA9SMZ7uZ}ܟ(max%yP;._4i6fzf}zir \ٸ|{RpS>t ^]5YmMx}/u A}yOw$ފsgG E;p/òO|G<] ~rymfUNE4/3+s M;3Lf;29;{.ƖZKC4 [cFPCh0D[3CAJ"ap)<ˣ/ʷlj'Tzd^z}8pQ<x*8(Mָ)|;I%E|Ks6/J~GL5ӯ#J1|[1>1K^Kw*@Kl5Sϖ]w՗i& `XC,v͕$RH JNFWg ]YE <u0W>Y$Ft|d- Ɇ|duവ/ JIÇ\[:dRC 3)ˆT=pp h?B(LJ}iREu%[-YP]*Zt(hP=R"nzV Ipuav@@Ω|X|H3?vip5BC̙2!fJ***IR6;k&2np'lO>G`5W'c'EZ6p^o~K[dDh8{5\δ'fbrjh{>[Yd?h^H1e<)^ƅ5O1۔HI+ Rw!%+Am;C:n} UG*݉T**3$6Oc(k#\2 vWVdI'2!~`Yp0FzӋoRwNWlLl!j|S;bޅaɺtikq>P- 3]Ϩof3ɰ^٬ײW}hfNFb q,>Il-P-<]$DwGvXwA`kNi82!Rkˤ!}][&ua`2E!X&u(Xɋt -%"J]3։7ï|8D>;n.{g'3ysRl$kUXwS: ,-t4U7H "Tڥ&[':C1a` mH,I,☪Jjvܝ->--GD*Y0!bOa#,(D!eA }y!YHf,^[P|P?E bƊm B:N5ee~̽@\ּ,d, \n -0`ƂW<2x$WfQF&K# irre-IPS!M]S4IR"ejѡtPz!*$~!L0IqIZ$I-$>&Tlr J)H&eI(qlc-4)q)!-)vZpfYJbĤ 2\6L^ j]J~%AZ r%VB#[ n VQ{wY%γi0S)SP苿]c~Ck[YsΆhLwCb7 ɝ]} E}Y¿e{|Ͽq+'* M>|nqF/f>K2z G{нQhhV[Mtͦs}R,\!tF A|NA 9I!TX 3zّ݃'f.xXb:`JlGL+û}pfwW.FPiQIL^^]w}G*t%`3]uAq١@KE{y_{Y3-ywFԶ.UvY3d;5:8~fݺP[_ZδFR^twyK?vYSKXa7`xH;@(rodf$1`iiI`17¡]\8rhi:D[:EFhuWPѡ(kV82>GP)^wc#mSh Z)~v&mGٽ)XϤ3VZr)֔r)"PHԓK!k!R+R'x8)ԞvD8c}``/&U8CLMqn(}t*bV= O.m+ ԹXj r;CB?(i6f ~_^}ufKMC6MƧdߏokAcH =}[9vl՘ $8;;J-.409y ~ry:ufULyp"E|k΅+0qOt9!h%{C[j-U %(lv}ChNMJe-FB)c1(VI$ͦ ^,(߲Ed^z\Q<8IE(~Mָ)|Nd ].[˷@ǞE骟>bp`"SgO"C4.?6̯Վ-dP?z|޸E1ۢw?ϓ^l|vw.AKl5s,#ned<:볆y^π ɹYJ ^Ybc"]RDa.0P9OR׭cҘ lHuRTcLb`5=ӷdy}~S{9*d4ޔCC'RX-i ,2#*~Ȗ,i\u`h?fh4P|[;H1e 4ˮEj%Kc 88X-@1M.k5ingCbv(~j"<*ެ2lk)QU9>V4!`n]$A*X[zkme[7RX)n aEZT`[["!iXraEZpYZrLwc:$IW`+ mB>%9-1r8):daO=\#J o] 
֬ԂaF!5*Œ)Э 3PֆH=@ <bH[0Ԃ&<5!Jh-lbPʥs8%^-׏>SX$.fƚX JHE&ɔq̅$ >7|:a5r gZ,7/s3Qowlkfvň(T<(Y:ҌkJ^E]t>< {`${oS rwgcNal]2ϯNgAIx,' :cLG^tjZm_];*n=?e/X)qC:P+?(_eGY]m8nDX煉Gs3-\~~ۿ~ROQ\>FtﷴcǮ0kG^:噧ބ~nviPĩ#茳aj,2uzdum_MoO݉W`D,Jx="2hw&ecX9{{vZ98r|oKlNI]^QbD5`KO?7E0:v:i̳NDz%(+U&ػnT=Pt;ު0Q wh 3يʘLnrIV{j<+xv'GP mj682]Fi3E_z))tq1Ghw V 4 ܸDaa0ǎpjR5eҫIQӔrGFN=K"AI r{{o*ZH vu*(CD8] c. ECD1f6wjjr3?Q^ߺq@pU .iwv*n<0H}z Y<]NT[I$!C @]Ƶ;RH0=sZO=b"8R9ˁK ! Fi@L1gce!⛭hrlps ,TdXdd!%T')Nˊ)͕JnɖUWIT=Tן0F `SBHlN{SW>D_m|]gQQPzP=F1NjVQwDqw`W)x K6r;*󽯳Tp%FΕZs H"U:x '- <6֕k ЄrG9瓺(B19VyJ813-p1Ǐ/y}}F.@ XtXlRH\n`l(|*fMŔUk |AnMU<͉AID]n]tf1t c[YƘ0ִig@$@ݦ//~|GLJj:cS+u1%aNv1 ӇeRG mkQ\]+z!9L(UPmY&s)j#0p uf|mJ=T<;}d zfEP :^ c_r`%EKBw[s3çX꽍֝ȡͫm2jR14usP,.,0S%!b>&nߘeI9]=8}t5+'èFB䪛jϥǚII$Sw>ZُsdVYBaWo9BImtZy,$-8jwM 00J=ZClΜ Jww}hjuVn{V֙SAs<9gi{G^ r9;Yy8y_z'K>pޝF! b ǎ7{D(_8j5zolAґ%|D]ﱑv&5OK:DP6˄g \]l9_ <*a{*(ClLPc9CLҵ@9Ō^M##GE 〯N6DI nkxK 'N9S0<K v5V#HCr .NzE\$-Wt9V[lB"3-6bijfzAys3k; 8E}Ka9bdaUwe@z@'ɤ}u0EvO1.˵W"8TRpdpT c9Q_{A .z]hPSpbQ<;U@xj89!!YfOݩRFhba,)t𾍔yU]P¨ G70il{Ѐ%㜗q!{c>]_>nmAm3 Y|_D:¾xƺ}¯EjUIe1Yա~zyzʧW?4sNRC_>6Xs??m!"@yK>Qux)h6d›~x3yQVϫ$Q63x fz W?=Y|;8w(`L(-N0x*>~"(%k^J^G$Bq^ ɋ_?s'o)A: ^|jK30= 07YWWy2hH4=a1iB;o3sg'H/$@dB@t;'Ė"5C$V7_K"*Ȳ?Lgw~^΂upMgψWJ45p{ғIJ>r>z^jهCpp_?}@%Ձmg;OFMPX_z~w*-ZU7k抌6rjȶ\lx%\ HqK݆3Z6uf.U+;.p;νë@@XA#ea$, K:Z_d!XMI Y~j$J铎$H"=_ P]'\_=nn6ghW!<4AcHD瓈yJA|@7xsN?]{Z"Smi^VeI$ a!8xpZIi.0 2&JMıAϙ8ٚWZq\lf[Ppmd` #!c%B4A= G:9R= n1ecDm-iC஌U0 zol:|ޝR8v߂#)qäWgN:jr4T(~Ô\40GkpDS@&˛:PoQy`BAk)I_I x~?)We $^:-Ċ;Cl AN]ĸχ7KΆg2L 0n"P<&XeBNmS`qX+E?#:}459l񕑑R(u1+5ia7CgtpzĝOfO8W{=h=fn!(P~wԌ@bQ4"#9 U,*+$ȋj@<Dk_C.b 0s^dkWAˡ@ BwK, jXnvb$01'?}ɣB9[N,\wj)DKթٗ8y<8A?"cσ 0;ۛ\?5~;5&< G]xKT|3)]^3$zO0T"A6<]/CQZyT/3M^ \4ӌڕ(|޿cɎ2krƢ9UbICiEf}H*[ zPNԨ%Qml˚.VQ4nAeFJڵf =7轻mz+#TaxTa*ov)Gl"C#8px-Ia-,[o=$c`\W}FIbvudJCΔ,z#~_ΟyRkGe;u|L:e艦:Ѩ!x:vfN۞:uy0!4Tߩ9HCki5[.#Ack$׊!Mt[MRͯi9yՃcW7HZ+ԒmE 5JmlZ- :ܰ qJ$H+X6yHr.uߝG_g Leg͝eм/ o{8xrS̞1J")l#̕Ì1bVS戢Tq""HL? 
U`(fʐGXƱLLbP1hXoʆEVCPTLk3PmB3X-Ak ED 28̱q;~~~O9% ߄i5^og |?7/KT0+ruB?;㴙 `6yt0ǧ>ADS3z#{ߋx2fi>H播p਼˓Ό.} &z$0፿ ̺yr6dA@OyZ@8Nz .zһ~xK٣ eDgxT j*75^:66pсm!fn/6pӐ3fm7Z.1%ٚZM{t^G.բ>|xh ..fע&Ӳ$7.eВ96UŬE)|(W^lMfܾXޖ%g*u3 ũH\6CoC|F{/UTsޚn3VJD˜\~L}H 'éIZn屚X^4⻼j_Iy"]e,.?u'l4C^WdNB!( XxP4FRPr z4j=Gϫu`Q-Vf &vf*-U'cNx}bβ)"JXX@~hy?tq#G7TI) O hw72d<ߍL7mky@CF 4$ Qc1#hW;X$Jlai Mi *Җa#`YL8uƆpRJ~FuJ9E;#Wͤ4f7=z >=VVStӪs80<Ж@FDSZc0ӰQ$Ә¸# ̅13!E7$ad CPuSNň4& wK~Q-8SgK%*)"\/"لb.Z$nv4ec\3W?dFcܜFt&R₃dB1u0B|]St%6kx]!NoYj>;cj-OM^dwwFvo98Z7_lB}hfo^mZ@RDƥ;F^j_,VBp"n:tu58g\QSڔ4T͂mvťh?]N-$ZS:2AV j1!k6F1%cKcbEDL" !GɜHC(?Q0jGtgT7O":e^/mk2+y9q2jr|'%P՛4:5{Ipyd{i"3";1ƸFnlw;|~W>_g+}lL|}̃v|Ѝӏ`}'#ǹO;L@.ߗ՗^Knu%=T{ޝ̢k!}NqLzc%[mom:ckI>n{h9ZUfN+M2AA<{6o%[ ^p7FFƋ^s/q O|M"j8N&pېpa]zML)xڧ]04ո&5?b&%Q`C3?mp [Ê8)u'U! u9J>@"|q[M_n_.fď̺YF"ۿ]qc+ g- 4qqPbxvHk}`$Ŷ,u")sĠv؝ fw>Z$ŨRQ0BƸ?"/T -AR10To L)&- zt1 GSPeiWҧ loy95JsX\.氵ݱ2 e1;= foasXQ-'Vkv"0R6L'!:Ke lYjl{l5#WcYJ^1G/ophiYnܸ͞ɻnְv5ƤvX: a\3 _7HJ]3HX2)E^>٫)|,VfGE!6, 1L5-##Ax(!WQY)pNˆh%K+3@R9uLQ3}!pRRt(E$JoJЉD viv0lQV 5D<Ŋ !b&D H\;dEL4eyZ(z|V6As.x OC%#L Qy'!L5beJ7,}fԌi^~3jM)j4ø&jn$IFsJ>ִ׭M/hY}Uӷ{sv̨m)@3LK$`|yϽ0N6GoGC}|w@/ 6GE'dӪB}xY΃O~&P,ᔋ>!f=1=kd&M6_ظ&'\2J# <$|^CR yH.H< :ئ\tז:!בW2PYG(@\41KBRtHDPtl.;~7*8QWFqY,ߓ#R.Tt%m}&KPb)).ҽ k* ՠ&VpΆFERDWm3Ah z["#Q|A\-ۄq7bp E߇1XQ?mw;Vcfj!W]~蝜lHSZFTޜ\;Ÿ`{v{4W!R7d,> G 7b%O>q`7(|Va̻@ES`qH3N3)m#'!'HOb]"GH1Q(62DFR˄5")[@E)PN!G+Gex,1cbRa%42!vZGL:F,ҘhE:&#c%l{$baԌ$ʽp$D.~(:a29Mts:E#)Y.OAXB3{.S;^=-?])yu32ϯT@:s*P6%ۺuBKhM5xx#Sh Q^L(rɩ:1g!-Q.}rz:'a!_lSm-=uwwLt•"G [ol\{HmIQ;a;p#B )m8<6˯:?>||:\waDp y:,ݚhid̋E^S*d T% Hu9Qy(łe};BF!CkSN$ّ5 `ѻAKMF1pg$hTǧSkqTi|\c̿Bn>E LFy~1zb󻛇{ )`%d-.D< F,N&N-8TIXbe,:uoÅB0X:^p(,Bkŵ,7[7n p&Ujvt&ǚ>i{Lm_۷/$I o5d HC-fBHwX]$?B4sjj5d eM F-RzcE`y6M03p$ͦ?lpf2j8x);| ݑ aֽ"X2&LS)poRVn5kn3Q-u7h-v^= /aGAqP2T9ɱ[X3XKeVSv!vGldJBu 1hdͱ?( nͩHvAIdϳcKpݵ E׼8\׵|?#었Ő_zj|mMH}v}>I>OI=pV:eʅ#uJEWr] DDFK)OW):X |)e{ XucGqMS LtZ`ɶa9 IQ} hS,i2;2z/R6[|%cNJ]6&o?=Z&)a HA|C$[BhƷkߒ5Y8U;2o^=8=ZeW?>|K0څ|>3GDϜM섮N;Z~M"Pj- 1Պd,m (* @R Vic%:QXa>rBYg-X,nz"[:ǜ[xoLn!Ŗnk!l 
Z\ hRZA*NPKtɖ-VQeJNGU T1k%5i)E3DTgP{XX-@ 9R^E c*)ܪWJЊ^;FWg3܌wl꬛@ FfYI:= b 1`M8ԭEX3$  Hs1 -E1*`SW$99`D$=TdF_}0_/f~ٱ<ҷ'|\07#OzJ徦2P5^ W_Ωٿ{z>㖫1&矀;sQN黧EWg]8pY/{߆tCD9՟0ԅ.ҿ 5nN&Mþ==i_8Jb m_/jn @KHK^օce|Ko%< 9GJ.w٘iݔ'd5qܵ}eЦ7 rly"9P ]%kP@(9ty)Q+S[( 03]W $B6>;g[vS~ qIO=ţQ=CQ tXmt(n[phUD |UN)-NagG0uJ~koN x΄hڗq@aN2GA$s^"H0Z##w "QHjNRrQI -ye" D^yϝSJh(u)Sw ;s=7{enwX cvK?5Tуr}T^NO$>[_|Fϳ+1̎Oǧٟ]vxpx)\wqkdqcNMF 8ZD>Xx}nCQ>ri9%3PzkΤ*񞷥ջ~:{Vڡii[ǘr=&[RplœUvh7/?Mn!s4X`Rd߻D"`łS%GoPEAvq;|jH+J4D7NΥ㏱T!_X%9j+dV2Jʐ.T,GE3Hr;v5C]WS\8?DX̵@!bV @8H.  dR2GuC!"!(~pk"0  C 懨?;ۇ ?Dre7zj\UܷY*A n4Fn_;oAT%q :~KyfJ]xd.Jzz{]x Q$ZnMR_DD+/%NM d3?|LH!;#KHYߕm PS\1nXOAHs3b*}rוk{`e~6۴^531y[ }*ID49_R&%Zr_ B`c$E)D9f`bw>1w||Eu^~\:LOߔͷs5.Xܔ7/$IW:BKʟM=F z2dYO@ YԣԄUƎ[gө7si$oo/7еdIp3j5I4){mI @n Wy踴 z, Ƙ*,, uרa yPR{K Io[ +7;oAE ~M zcALfl^ގg9 Cdv "NrX|D:<z\`1OflBdyj6^<VS԰[]8͚@88o^u8PA{)n .GvV/hfr E@lXD ,4"+-d ZURDAɐR%z@Fv8mL i @#5* JYBNR•`B"C'1aVl4%FWİ{|g40f"I onNm/LoV72\B5L!%X"#7_`ոc$FH<)@@dN 2P@6PR_q2@T2eD >C" ρC- cn%amv8 KUz7jZ֛04)9+3R)bֈ#kpr@ՠ-D:+Ugn_u1&+G&(ްީ`Z!NJl+Ĺ 5,p 1~H_Iv~XbLo%}<(E#FRp0 &. M)5+K[ҔtOJK}&MJm鎺*gK`ᶋ0+cE!іCJKIQB z˷k9fCׇȅ:5rj:n@]elL$cB;CScDRJf;ځa5GJPhP!QN)%3(iZ-X=sy[>H[q` J.$VXAMߒcn1iQ-l_tZe޽]dCH9U#,9 0 Ĉep]PL 2"t/#^p8~a:[;48Q[T TL +!%DjR8's#n%dTioC_1T0!Q>~.C 54@jMB#!vMFԻ~e&ƍn1+\ m}΃l$к("pԖX Uh()QnhSn)UzCBZ3XANb[`څ@VkmEST/pN`F0+NҍϢ$۴.6rȐ2řufY˰f(A&* BG|'Qᠧz[.5Kͳm. 
gHcNiVq.a+XRmnreVk$_) ?vP~2^yApSĎ-/JYJ $_(R|Zy^tC=6Ͽҿl_MaT(==.*N'Iڋ4F\ -肂 4B)6fq8_Sns}Gc: 8AEyD_O/e'F͋2p$^HD :,|A򰏪xOxOg~ }m^S1^##*dw-8ޓ4Dj~ *Tgt4 APtЎGPr$bˑk$tf B޺g˷TjAڜ4Svk,Sݧ1'>h0~!&u|Q,OŸ(CYŷ%|Ps&~j[V׋XY"y}ixNTc (7Z*0K{@|B_>ՙի"p+@ƭvrvGjc[-|?Y?5|ucp(& ݬVkBݽӢ!eW7;m%&EpX!t=>KS϶NSCKI8x;ziլy~l6aeś0$ESo_i)uSqQohDRSSQn)WDvk R{N4%~^?-W-|ӄ5`65C,M>zC#l6(ۻ,P>x>6; 3]TB% (23 -TBKaZILw-8g)L'Ɍ}RYz;0>緝ϵ<@ѮYmE^thbw#Gvev)ѵԭp'%wg_^].mJPF-12D͔ sWSD̋M9m&^y(hx/#u2SءԜ;v!]FFi' hh?~ CZUzuoWs(RPj WP)Q=R:K:pܲDU-/5Gٻ}&eTI+O㜴>҄0M|"czIۧN :(sxATE=VS)*Gj0ѶXȨI9(dGjp"ZXoΖZݜ%l@C7r#%nQq_\ߟdRd93Fi$iy3 ^҅vR9M4M{I."*7ZX>ҩBq:լW:g&TVGXT=>RvOPDS`X*qK*aWfnx!|nc`D(0HEDSIoZ*(v4 ahCs` FUn_nBMlH75D--\UhԬQB4 eo( X+6[4 F(sd`B7LOzߡ6/p] 'q4#pqBп/\]:d\ü^V>Λ?8o֛vq4>/z,VB g[|(AɸXIB}tm꒟?-Ww0Z6|j]9依/o|5wWfccoG>LDEOӺy;7^1L{g_ T?v}?o?vuhsާ9Ns{3l>8؁u֛Õsy.nQl A1() Qb5҆ZbMh:'HHa2o+vf"=)tmmyޛnϵu{mB4ff L9&HSXӔq`Ԅڂ[s⿥X:p?=ڸ.hL4H09Xo^p6Xo^DEԝL\o^D"ŀy`y9`o^bs821XAfh~z~dl~d~,\j_,_,_,%_,pBT&kPa#IʔLe K163<2fU3-hx⧇{^__pϋ+B+qb#_D*ƺӵ}_юD~;xє6?/7f _0+>4r3ư!,e3b}p;<2S?–2 EijD*-(!,Ni#ol*rM /t {z.ԓ(_|3Qs [y2pFpAIY2B4MR 3İ? i"Eq9#Q)%ћdghJ=.j;E&z^Z v8߀W쎔oZ w+}7þ7ɎO͇˕YKmÏeMZ l ;GE}[A{-2ݴ3Vl gU_ZPԜIY+yY{xԟְˮ7kYӒQއ g \dG@NpdlywC5RI1H )[ClY ze%ՓB?.`/qcwѪ!j ,kDAݭXD^kj/vu:4zOCdK'//tlY(XFbԗ Ƹ?{ pV^~t=˃C{?icp)E,Pfb=fMlW^ ~RJԯм o~a`WRT8ӯMjw- *3vsϴ$OvFeTDPy €?&U%o܋i5ٞl<Y4qRz](# &U*W{ug>wԙ&i"6چ`˦r&}K.bѴ>$EF =;{%uc?7)J!s&EpL䂪Eg"w.$&|S5*hb W +:D2ЪCV Wbzg55g]%׭YҮ"uϐ,u߾%0ɾݘBOZ9oe3O*K5}%VW6akbr=2g+ݴXl[WC|<_wӥ[o?rM<s%Fo gLp7נ?!۩}X)rc02pUh7P 2;F"M<ň|nvAv/ 4-2iZw.BpYԋ|v#HnaP.ojn5t? 
ExU85kzkB^&Z9@9A%A@Zs>ˎA%A͚5VA%R;[yxֲPusBHwt4՘=c~OPI&f c.HPI`n4fJyc@PIvfZ11VB1sZ1ƘsITŘz11栒 _+=ƘsXIИ/,0cc9$Ib̂2>ƘsPI/,Dcc!$A{cJ1<ƘJBqٛD11R$Һ1fIcca%Ғ+=ƘsPIX/,؟k1zcV]cC1K]["ƘQt11搒jތc5c* /Ƭ%cy1pc1<ƘCJF/Ƭ fcy1Bqbbccޒ<[f-[1<sSw1"zGcy1朧xb0kcc%HQu_nK `Y3 mba|u 4Y.4dj06bOokl]v[~ǟ>_a4_jO}=+^L|;+9k8WJ\u)Opse `?>|?<+?H|!8fF̘JLx`l0=v:V uV4u,_ &}8glRjsfHQpGsؼ ^ gɵUwݸVVjHnު;Mj;}e6>F>qN{\!KD$J&P$%ྦྷI8G`[ 5ŝJeQƝQHLM<=0<oc9//md9W'|\0Opf&̌0Y&l>^ܘ<}+2o؏=x#N0F[CqHcM-\*K-8cdVdDIJI96>"N3D2+#kHFGAb HaP^76LL2eB*!t8IRRSlX-G5D:bA9Z+Sc)CZOf,K Ό 4QN䌖&D p0#)b/=!,$;W {X"%ҦJ prQ LR䩈23JNC ;18@↢,UjVI[4Xj6ݯUL*U L|<&=?& ),f u& @k`AiW;#1aeJ0a#a i-s{m$ L w{TB7[d o&6|HKLtz>*-\c.LB9 O Jʖ.}ti>3JrGT'YOq*fQ*}kөzoT{sU-ߐ7>?0 >aڙ9?!>1!#{ë^N|{; OSܯ:ֿ32?\/x mKjxvȁlϼG3 i+&VghT=tg tJ;6ycJ0O QZ#Fް~jLr 02']f4t@|@S#v"H}>d5!Sx8쥴0++m\[5\5#(]=pXuٝL‡`jGa9\}a^:K*-Gٜ֔)W"AS!A33W!,BFp'a7ڵwqIGIaQ)6`a ;JAyVwހLB amXibiϵ,hFh[2s_ l=2|Sod#ᥔd4Ɲyϡ @RB,'*\\pk$1T9)qQ@ģաD= lP\˨VD6"%^ pMՁge~6~  ɧ~I<%Y˘6b1h! +` -c.EÆ+`E!`2U&XD1iصh b(mOK,L>WZnJ0J0p ;, YP,!$SQ"Y Fp&JHveQHTRư`IkI&prc~U}YVgٷ+Bm=6jC3P -#x里fY6`%x}l.31 ;-py~r){Px,hZTl?3y}qܛ/#{H|{; nNd]`ҮՀ/)e0x.p"dgir"5V^OZ.DyòSDD̚ D $Vqw n x@(ӄȺsfuJ*Z=IS)«<{/Nz\_/z@ kd;qt(:hHw 3J[΢E<[5bJq:hM;<\p-)mKvn8:ڤMw4MR@OiC^8nS#۞їu~jF_6D&b^d)bM5ZAJdYF #6GF9G:Lj̕ڣzzIf湛z=fHOT[XB~O1fv 4} R*\=tUz{0f:~_Uyٽ,f}e(|/D $%-)w"HO9ij헰ɧ:U(g@ Ñk妻Nަ)Tkk8 nR~J46MOg|8-ȵMifNNO|v44v߮>^,;g|$aKi_XJS*3o1LS4LGof&E#}D3 r_5#+-Dv4g5U=a!8JXvG򋐲(=䑰ւ I{#q H4-?j}Y4ۥQwHڹRw;?Y J`2{wH)(d<*E9\kK6.QK3g2ѻ0 !{?%hu<^Ԛm9M/2&+ܡ2y^ и6 I)GͭN6pa0cI#J4YJ9uZ<ƭ)O4)5CiL), ?;aZ/kSjJI4򷡿s'`$α77l<WUU5eMN + .2T'RSV1쭗΂,`xMtP ;IÝ~\5CUg*bDtiޘM h3e(/zKNG=tS Jf+*eQ$d@)AvO M5$$Uo/A@jv%;HpB2ȟNjI?y~1'&,Ҁ&DMmpzݝp2ZքCuxr6ZڽqN;/ctfs[(#f ԬN])wJɋٟ liJ} ջMzq_QJģ"47F@d# A$#24;)uZz%80> *NUu5qHo@b:{${PBV;ov.ֽ5onGeӛC~uٰs! 
)_῍ b۾PJkwJa5Y6ݔuhWR\J|}.JJF}3|ߊM#ĵ@!:kל?[?IS\)}iY]NQ {OXM:N,J?#IϕTZH7Q;QRgctUq託wIgV﷟޵-jkG=5\w_F_D7 iz: $y8kwp" E0{( *'9KP_*h"H TPb=!QݻV޵җpGSk,wt(S)/ hќvaY 2UZ8R)Q(qGY\;S"]hJjF&T$du FwW+A EZSN}x(hhτYdaʛ[thKARcxJ"t9~+3lk0}Hvi.]l!:k~rDzV.\ZvEʢ#8~TCMeQz9q$윰 sB"pb d;2Jj; 錥{m F{l  nr:G7r%/!D~VbvΝ0\$aHp. %h$OqaDXXd0g 8x SNWg0c$7Wš^)N/ciև]JD.EY)uѺK|֌Hӽ-U&* s/opOm܏*xx=0rkUK_dOI S,kL$+M)I^MӔ'r!'Y{”*ǚ;CĞ!ѱ&1;#\2Z {)ka0k `AҏU * 쏱ϻR%)g|~ ˖N^̯<4y\5\۸n${t\X|i7o(]Z]8vP3MBf ~˪DZ^Y3>y8 fvBK&F?2K'_{?aԜYt4"Lba/sl|KnDsq<<-KScAJ-Iەn=f_E3k+ (~AQD. *8ߤ!U U2^OkF$7gw!;ۇa#AׄFa!c..)>M,LX)'g%i&{0#`#t$r)K:9g_g+Y)8 hisEW]o)^pRo{u쬊5$=׫`+0Fft.JJw5Mefe&0Sb^@V`&u=M4dx&hewOMa:_lnbMvcK+[3 ½F۰( 53q/7&9rط7D> D!)i ZnD"L#e`Si ,462-LOy#&j{pҿ)[KxlX#C%'hXBBs=񺝻(ZwB'J 5{a- FAt @@I\F{KDpy +ϝ +ϝ;"!Md"=!+P!qc)Kppb!>[e-!f%X"|ijEϠC) nC=)%5N׶Tbj52Υ3)aՔּQ1̥ ʀ7`{9-5ŔlRωiIRU9KSєAjS|c(8==@G!e2}+5TcT"1m\Zs9[[Emu^:V#dmNT)Xuf9a9F|sh:\AH I l1K=I %)́oVhrN^͍pK-%]OScK7^QI|KjKsS}R2cĎ=C2(FqivȰW-N WtWJu0Z1A<l?pfclwM o5Az_7:$Cu΅-d^_]J.#9=:. F$b( 1TИh#d{[!\ ẇH3!7lϱup+efSk' czrg8rR βUҜ\vNV8Ll?oȏ7@Ŭf-mkݹg?zWPh~F羧;=76#>K5w28cZ%X֘x3ƈS8ϼfV`w/}C$[zcwٰ/E ހv2,mZ{7 BSCaY殥yXp|}oW>n'x}Mq7}q|<;H^LVу`<^w'oQ Md}|c>;磃ޕo-0i nCOϫ$Z޽^vǏݸo[/o!n7<_oOoF Pp钊t=|Y5oNބXk0|MUZM?ܸ{cHxue-_@yS}us<̎vc> Cuu7~({ξz6cLvki-Yp$^٭ o FǠ'ן?LdR*4a -WΌL{y~p37[˯BkvYwH _Nz6&gLw΂p(},FRJd(\AvA_Q:ٷi)8 ;~):a):ť"=g$8R1pW%qbbhC% cIx]U+誕ϠD4Ω٢zzBY2[ʇ- +Szlu[rۄx=a%%RiʜJB$yHB\ -(gd`‰_ v; Wu'ZZ*oZg#,$@? 4$,,e[dk= Z`f(]Mo0c [nzsBYK҈AY$JB9O8gboVHw:6 c@(c  M_J$X'. 
baπJ 0M0dHZbaQ2x BǠV 8VPM3x+EyN*ͫMzrPli$rs p2& ,W"0r&xQ%Qh(_̯LeٿEfk|' cbB&(Tx&-AM$A2D ^^ '6m C뷾'gqRTED لM 1Ku`b˰n&逜Bhva?BӰȧ&f|23;M'``gC<05>y57P@FOK}EZDxђG&Y_0暦Yo`garIQ1#?@/4> w”pH R_7._CR*&#ɹ8NZb!Li BXL߇B,Zr\pp~;:.]=i `Ne$@Xv/0&Wk͎<ų0 T>+P8,^ h 08|25 yCa%9dFJ%AX(fT.'Z5㓵悆b2dc6:M(Bg9P ZezEpQJ`Q P L+-*n!~(Z?Xy ͮ"q8Q.ix  k]aU@elVU`{Mi'GOh1.3j vTUOH2oGLa+bc,=+dj3+ؠSy88{cH%Tp}6ٻ7n$ "mo~Ir _0 ٤=H#[h]IIRm_]bR4S7o|Sxgf9CecyKꕧPz ƿ?{i?K7_ 㽳+@#OGjr_N>>3]ۺkqu27HC~ztw%D2,BQzjjM^_W3 vo+!z䏫IV7GufWx_a^a}x1&|~gzOLV0ڛ9]SS14S58”*lV^O!q kk-&[?iD7u L~}ڞ>l%ʱ؈x]Q0_LsFij*C`^<@z%ixJ9,Av1wW &LS Rf)!<=HP usubD2%0MR B((F6_&2A J R7|@dWOh)l[]HИܐ?(*` g}_EU; m]RW2J?O0ǻ\ik.{q˅x|tW|!?bXܾŢ)AVߑ̇3>>̏tVyUe6j_EhI ܟ-5vHi҂1q[Zau,8fk[>- &~a=9%I@^ʥ^|&}HGFXSkeQLZ&J"eN}Z7ӂӏ”Wi7eJ*,Tx I Ua;*^VU7*pZSJrD`u㩰Zb17E9jW->a/|ݼȖ׆ t<"͵f̊d{@j+cŗ GTk @ez2 d^-Cd5c" ;oVe{@xwf^ldotjyTG4k1`DYW^hrzn^QwQb;F <_6K3+yD.(X(հ)< Uf( ʒ:" *T Ԥ$㸄%4,+pa2UC]5dc|,kV>="3(Xr<&(f$t:`r,U*ٛ!c+!{j 3?sV!c^U x]X! ZP3ٳ[Q,K sR\!j3U}a׷.!_ILt@)sP9t4@hA9B1zTu:FشKp.LRU{7L >7 '%xb4X:\P`Gf]aू'Q Ւ3mEsVVyQi2~-lh9HXSu"LiayƂa=;SqƲu:St:[fD&p(50N` hᤲTXG \;@)!ato!-A-!vm4 CXhlFMHoF)-ۂ'G]?w2߻iY .F3pul4j ~xy}8Q0 `!`/ c4Jso^fk.uD{bBRA5[j {=(&*j _׈gkLòQyl(m0g0)kogZ˫ F_Š`e?T_fY6c^-߻֠nnރ.RuCKz8ZĊ2y3VLP X,4&/I c )rg$ro-Âv}ȳI9|{3@fM$|: ÑuF`Zך=V:n Ak/$-E AйPoVnOd H¤pxyO8Q̼OqQ0R#0((_퍗5(Qx @/@禺P΁d<%yuIXUuY"򱠔hH;:`& J1(D`F,h`(|mtI@:x) I<?9P2-vXa71Ή0n&-Bk8N&BKnmZ.lV^gN0A$D9E )[f/}# bSeDZ %@b_^f- ǩQqhn:YGix띜*H<&EI#۫'/b+p;7&(ZF:%0swMVVDdή ׎3߄QHǛ&}fLUzzM4GJ`#i²tCu.Y2nJ+p@aE=SJunr')}JMV[Sjdlwtc^/wݗVXS`m0}Uts8nva@Y wۯH0bz87^(d ;=2#sZWw?l ?^|٧Jo ꨚww̯z coT+:#(Vg剞_ѱ#'s=G"4WJ%0ݦ7D*[$U[e#&xV|pV|7 VTdŪ$I8ʉLI3M` #%[| .ЊSFP0"C@)w?DP9)́x9(Ҟ`Qܵ8Y>q־b8'g}<^j:}z\Q֭\@ӦcBIDzNKWktTpԐeTs4D0y?G# w,}P8^R{*fM]LE<)fwa# k8_kOF(E;ծ> NȱRV4s=8 (ޤNtDsљ^HZt gV 'JMm5-[MlW7Qn^DiU!hc<q  GxR;L0ZNnw^- ~||<nb&nbkjv gA&ZZJ(E ASIDA,ʛ.}%}ߐĢM A[Y/)Vyy :/S")+[|"*S܆U+y gasB2S}i*=f"âf/:f؜rσ#=3"&T2}VSŬr|ƕ,UsA@ Ycv \mmRMڲeћI%m/5 c +_,AAyojP*(5A74 'g¹1.y: vԦcc ǟ|<{7oxwXhUڋ-«^GJA˷Θ?^>ub~s2ϧw7]:<zYx>̧7s[/QR/%Y\ҤX( x 
f<p<B`nZZiYopo2%&KpKY2w] K 16aHrJ?XQf)a3Sbg=جm83e*G <u!Ok$zԗ=?gP 8xTkIv~82BNz:t] TE+JB#Zk'AvP3lfo%=۞1.<zGfdT4i`:6%Կ&Q$ а:N%bUN%<+vBaUpmU/!-8O r-K>j'hXK|ZBXލQ4Ç:q@qR/K$u$/MT'֠0R$mz2Mʋ !t.v0>%,d.PN@c>8` $.MK@%I# NIUj'aAJԂ3'j笡RY5BXM=0aAҭ:]x)1|E;`_`Jɖ"kM4^#(FFS@BajxMT{GSA69Q9@@ Х4)\hyM6Ry-R[+5b$Nd$ jjM9';#ᳱ JlgZ۶_[ϩdv:>~&G3_eJ$Heib 6 Ps25!%#Q@afX2-X9W>!4d}_Wzu0cƆ>k/vW{9 kO5O\ |<ύ)m !$!A ccǀ1L Qk ` p !KST!PFTr}3Q@DAeXY-UX02j È`4)|= #aԼ4&DX —($*a! P@}5B3%f$-]KQ!X~ixR_j@;jyW]^ʖfK}6%׏_xALB%g]p,a@qy%b6Ot+R(l>=\ 6Čp}-=t?uD %Qp$7!R5ր^Mt NJSo| Qϲ;G?gl #{ I;ՀIEF't0b(otyz/A?ShY'>S|AJ &0(l> S- Ǒ}0 bt7w?&Ӗ@Vp,LnpQ`3*(/ԴUF0E*)iU|+P4,z B+0*e:@*c2Ijy\q2xJL/{]_|8fo0e~$R-K wZA0#AFz>ѻ#aDH\ ; @*~.G]6zb|oXt4ml>^>Ԅo2'ܳy|aPozZO*8;Ջ:Ҹ= -<[lzIմIDda۰0]9qD,Ceh.WYr+0u~TRN$I؟_p.~ e<}M{w Fy: VO9@;Fy0HcHZ>-YL+gBè i4Sp8r8]5{n)NܽN9GZz= )!ɵ3TqSiɦ?e/l^U5&iV )sxy@@{2=Wc=Հ8B $|,dq(}E0!FXE3Tp#!7yGݠs?̾śZR&~ͼP\9x LNk1؝İr,W /Fr@?YNՔ{>[~T)w`jx(7;e_g]23p3GxfŲ=N-3%m,WTnur~M}S15 YOfJ;X% Mq I\Jo_HbI*F:+vjR6\JJV @$ؖ"ڕ[,+x7^IG,Y,:A tmp7`<+5ȈkdWheZyUɮWWi7A[x ѱ7[0$HvPbN}6VtU /$:pi!uPPWDî!IewE= EwZ3A%dRxʹrqqR[JۯB*ĈvԺ9߭Rj15Xk<Β+Qh*UЙ pE e > hDtJ5LuQ*!g)C)Y~%R!]EMD&ҁT5 d 4P& I*G$֩B9 EQ@-Q*^}FйfFVqVaK3^f zs[9m|&88_?:+%/~niӡ޼ mfUjlvJż$jTԨ*()cH9W$Upq&d_Q$8 0aH b!G c5cC %'a̿A1]O0'zNhbPFUSZX\Fkv h}C E* N9TiVl|.lRRdŐqep %h%|ςUi{bq0־@ϗLIr~ewf_GTdd0ɭ6( 0FDա-{wuepn9/5WktR \2VC~eKǚf0%ķCmwwˍfb|7}P*_@6i(OdOoՌ^z=::ЈJU63))`慘ٝj^_">x^?b[9ₕc@HD>};GL-/u w9oc8x`ggGp7dfN#npUVq7bfOA+&vDYꝕuE itB I糉Ȼe-PqO|9Ha'8dP@R9g.{6F[E9YԛƩsmEJoE-RVN)ˍ-ʚ 럔k߼)(e WKyաY'ol7߿ SVry$aUP6L١򼝔)|S6Xxr/و>4'F3N$T>gyƒ,13_E{Oܔ$l3l"m^y2GY;}y?wLbI/wOO3E@fwxAIc\B'' #l,2$cjdCc%D5+Yi6BbBąRel@*&E8 d!$2L ,L0 $F!H"$DIY K HnCDyg! bFHacJ%c$d"((MH!Ȫ`y%$\r%5k H2I{ZޫCc5/^ޟ> ZxҩO<_s7_HHrj%ѻ立w/Fz$H^2z~?#t=zGuUH_9֓*‚'o5:HK"B%jDJ!Xs"]_Ɠɖ6S&{ws?=azV)ՙlkWsQ*k秴F*(h\c{mpjlWl,ù[Z=poSVOT)oݙֱޭ )ԉ S;$E] Nlwt|*.Mvr 9r}yɸ|FMe`rK>z.ݴ%;icby⣓JU4]Zbש[N 7>GDZbB2}RW*]TUFQNzn {qmxp @B nq|p;n-ѹ:u` BD^=Z!1ұsכS04OH&]#- -B%h ;SK4:p;.~qP;~[X*S:8'qҍY\cnP(Ĝf.^yCC9qmɤC"Xզa+-9\8:]? 
Ј=',L"$rUoU'qUi,"%rΝ0M DMK`"M֪ڧ?b//\@hn%_flW7tj@joua-v25˥u3Bs0:kuLV% T @r/m#Uݼr[g{J֚0(Г.gھϲl5sX2&nW9kw!}YuFˣYH_RSbC$NaF(z:1\_z+VrXMq;Wu:O{dǠxr MQH8׿}AJC0Tk ,4n%"Q&zfz\}5dMJ-3a GCU,Gx"5>UTDivOϋjx\8]܏`3#ױKzXcoZ>jT~| >'{ݡ?᣷]"i8\n1Vj竴 f EO o}=JrWZcrc}]fB\2VS~eK2ȚƏFw!~x y(Z0A;Qy^hP՞͟L`J x!"mnځ9 _0HD ӊ.uS>opxbje`ggGp7dfN#nҨXTYn.kz5h,g6Mo֘v*!5w{H~"8Vcs$WԈֻՖyV]Z1VSFYs,[k<^!y4GR+>+FvJ²wRBlV/\L "MΚ.2qÔUOA*W-_YTGRW#o|Yau/uN҇E;7IgM{ݛo)YWm2ܒ{}жW}pi.0#q撗>av;A 4G4 I#Fa9#m#tr}8ԑ빲ן.SEQ▙L/Sxamߚ0xOAi+>Q*>S +0ăg Kp>׏l I6!"Du dF#?Ǜl5(1YD+]oQ4s0 @2!*a02 w5` ʍ5KQB|Si@0gzǑ_ݑ0$٧O;vƖ{:$۲C2uǘ`:i"Y*j ԂS826PJJhHiC)X [X#u A^Ck~`ّɃQ}m%7 '^ޟ,+Х>i՛hsH D |}7v(P;2|v !Ct;|Cm5P+7 /Ɠmc‚go14%B(y޲Dp!ZAr%*Y BM\W̮xQ(zґg3rQ*<*<ӂKg+>U Q~2]MYV(wqy 0&bٛc5l% _Kys@G=!ިS=pPr&FrbѰ򨮏K^YK.D6~Z8Kq:-;|U~h9/e~h]=Ѕ+)Z]oN^vaE0(Y7-/ޙ_HI.Zn%rO#GMW!QӤְ=iqT7Ax|C="H>yK֭[CtPQ΁_g"s$8us($;Cit\@D=㪝8FCӈF묩&GkؖF!ٺ`Fq:Z($.CYMF3mT WUMY>/O#bb4W*v!XZ'ُ'E_FɵW^}u7F3dS(qEՀ6F4 sn qDVRnIM.]O^ ܋gG/O N'[o+iΎT$k.ơ;7g ~n;/>#*4_NyUIzhcV9w5J/:5J7u4V7YlQnCQ.mFHolEjngd]Z()P9Ϻ_==/EeQ B3VBân?M MJ M0I^Ա˶QtFav^#eɛΞ)oqih$2t7)ȟpɭ&6[Mެ؛袻:$ e!\?-l`bMDn/P%E Nx|ၙ][܈e?>uI=e+i`O)_\^T@+'FVXi?>,;~< wRjhݣ٩e 9 8oCxFx9aͬG@.pD#{sk'q#K^t>;P) ŚU}6P^%+`20k6!-$Hw+Gw5@A(N$E1 M4H deQdvB v͸JXrϫ\Iq5eTTh5PPY-,%x}e!k{#;"y0b~f1 e4\:z͒Q}W)[b>wCx;d(ooy;!z( pɶz1a|8B(y޶DdGBY,+)4`!ٴDyukKeʻ)y1#Sj0³Y/-Z7tRS508uN-mpp]p,/^,8ʫ#kZ[V_@N ån;sCh >NEF/'Y ?%L4lO:cA@Gh/w|KЏE:r/!-;#67퇎tB;XpwqCeEM. _Z7#R7#j|݈՘cmc [P /Sl:\{yݺu2wq*u3* 6h Vuq7u[Ct N8^а# t1"8ҽ5*F6rtC*4k'ΑqYhѸ,HЉiRѴN-uC]< v/w4 _Xd4Nj]Ǘyn'!3j& &7^rwnbsJb|Cv<>aȚhij$|@X.iswe{t6xf>f<&1U`^cg%QH;$Q5 `4QV?3$8z:QѭBIMt(47hrڍm'F 5AnfКFQ[񒍂E66fK򪓸*)z1G:M hU:C0αHNHqO܋k RDQە+@֞] B0\}FhӍ?DJ꺇Բ`7ZAEzi&0.{snϸ܋gG/dr(yMٓjʬ;ۿ:fzf9,,Li9cp;;B32ɭBC4/pNa! 
uၻp!nilEJyN]4O{YɺAQRruzz_:) ő7M2,?&M()H`CBeۨyjH.x򦳧w;C 4Zd\&#% 4F pgvsAV}rMhg$|4[M5s*3iM|@[y?)]'Q[-Gf^*O>¯X"oHɕ%f=$6?8fpq`T8Rc/OftM"~)gQ*t4i|2/dSsdWKQQ21Ob|#`0^v~s.h'i6\A$r@/^>.9wZY/A:}J ǩgJ6oxfWю*mB#ѵd8},Ƿi2гeCQvYjf 3f]%H?qdDݛ1:GHWTG5@B^|]tGj1lc$2Ӳ}e Rmu 5Rc) 5EMW^bĞΞL`Jv0&"Uz(ob-A1y-昙} 9}Z2+:5?lG6 l_Jˋ evQ|"ízXz} lf(\_m@2 ۻ=;Ki*F> A$RӀ ,#pa2 WZ@C,_W8 /;c8AaxwqƟ$KDfjn+H9FE 1r6a TΖipIۢ rݮoKd2n:298 2R!:DH+F:8D CInm\kv.3q;0q s㐲tDB*Y9!4KsJa-HC( P@  B# l+U@"!a´F>) K0O,DӝuL]2Sև$yk/\d3iϤ|]Bb[x-YI*>Dz{h.Q>~q^<͓pg$[{/ oɲ3n~x Q2CG^x|UkיC wȀ "_x0_gdKƬx( C=f^L6Izg0L;$fqL|!6q_)|L BuǿxUh )^jւ1dAE0Z(h y3x? nxpܞ*4U;GȌ89uY + 5%dqǐI OHͬN >_<`%(lDamSO/( vOO]#Y/=P}.yރŹld_}Вlg栗/rg$..LWUuk C*ֶ*;I*}iiWhEUɐ&Kj *@ wrڸ%r[~I[hY:]묇Gt-nf;\Dy4"2mg4TsmŖ'%=,=6fɃ ވoW#.ڒ*C',PݏN%i$ Bā'珿ˌR>$]28E a ֯=Z797D1MddlJl$yM` VEr'F1kS_ba!E"UHw#F$-Td}ByEl%Y2r?6]BWGh Bvqpw$1Xbs!DOxtA3!5Ϳ93Wh,Pcp+Ex</r4-h 7O|smk P91YG9}hāsq (i5۱PG-< PS"UZCT;5@h̤i_๓TDx*CPYDQ#({w4PgBr8KPp81-N,>14T(% }i$\>O X!@ \'n ͌|z]@ W@ )nBB%E L;&I.:<*<||!R"E'@FHG' x@t/"Pug@ x}#WB\咯 K>sn*!Tb+h { RYeq֖^05۩!9R(TڛzլOZd0JB;4Siʒޝ,KtӣWROގJ~)ϵkCYEG"HVyYgFd8ELDufԲ!a hVi;[  B*h!']c@myY(R !1kB>j/|"k %&9-Vv_8!RzGw;1C)u5H0Klb>NFKdᤃh6saSdmlшA@kЛk y!wk<߿eBapR]*d XD4H : 3 [>^۹E~{-Y=Z~)Db+? 
10525ms (15:47:31.599)
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[1402113427]: [10.525395547s] [10.525395547s] END
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.600032 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.600730 4713 trace.go:236] Trace[751532850]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 15:47:17.179) (total time: 14420ms):
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[751532850]: ---"Objects listed" error: 14420ms (15:47:31.600)
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[751532850]: [14.420662715s] [14.420662715s] END
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.600992 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.601975 4713 trace.go:236] Trace[537749470]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 15:47:21.588) (total time: 10013ms):
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[537749470]: ---"Objects listed" error: 10013ms (15:47:31.601)
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[537749470]: [10.013872369s] [10.013872369s] END
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.601989 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.602458 4713 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.602837 4713 trace.go:236] Trace[404940062]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (27-Jan-2026 15:47:21.542) (total time: 10059ms):
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[404940062]: ---"Objects listed" error: 10059ms (15:47:31.602)
Jan 27 15:47:31 crc kubenswrapper[4713]: Trace[404940062]: [10.059933804s] [10.059933804s] END
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.602862 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.619594 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.624655 4713 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40658->192.168.126.11:17697: read: connection reset by peer" start-of-body=
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.624734 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": read tcp 192.168.126.11:40658->192.168.126.11:17697: read: connection reset by peer"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.625151 4713 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body=
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.625225 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.642651 4713 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.837313 4713 apiserver.go:52] "Watching apiserver"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.840648 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.840985 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"]
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.841465 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.841573 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.841595 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.841695 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.841709 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.842347 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.842350 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.842543 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.842602 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.844363 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.844364 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.844573 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.844637 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.844774 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.845528 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.845591 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.845749 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.845748 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.853351 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 11:13:01.031738523 +0000 UTC
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.870037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.880643 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status:
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.894990 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.906675 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.923137 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.933266 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.940654 4713 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.942939 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.943923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944015 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944065 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944099 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944116 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944134 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944155 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: 
\"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944172 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944195 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944215 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944278 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944299 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944317 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944332 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944368 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944409 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944431 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944430 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944451 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944449 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944470 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944542 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944565 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944586 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944605 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944655 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944673 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944691 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944670 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944706 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944795 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944834 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944859 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944881 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944923 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944958 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944974 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.944988 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945109 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945151 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945186 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945211 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945237 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945260 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945272 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945284 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945307 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945329 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945351 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945356 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945421 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945481 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945510 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 
15:47:31.945504 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945537 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945566 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945608 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945635 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945660 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" 
(UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945691 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945716 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945738 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945763 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945786 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 
15:47:31.945811 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945834 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945862 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945889 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945916 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945941 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945993 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946022 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946067 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946092 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:47:31 crc kubenswrapper[4713]: 
I0127 15:47:31.946115 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946140 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946360 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946388 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946416 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946439 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946463 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946486 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946510 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946580 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946605 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946630 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946679 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946706 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946727 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946745 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946769 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946787 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946806 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946826 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946843 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946863 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946904 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946928 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946949 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946972 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946998 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 27 15:47:31 crc 
kubenswrapper[4713]: I0127 15:47:31.947019 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947080 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947103 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947124 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947150 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947225 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947252 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947283 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947309 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947335 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947360 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947388 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947413 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947438 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947462 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947484 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947509 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947533 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947556 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947580 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: 
\"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947602 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947625 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947648 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947682 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947709 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947734 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947756 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947779 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947800 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947822 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947847 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947874 
4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947898 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947919 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947942 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947969 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948016 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948149 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948183 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948212 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948235 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 27 15:47:31 crc 
kubenswrapper[4713]: I0127 15:47:31.948262 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948286 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948312 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948335 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948440 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948478 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: 
\"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948501 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948525 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948547 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948571 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948872 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc 
kubenswrapper[4713]: I0127 15:47:31.948896 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948921 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948948 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948975 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949001 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949031 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949081 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949107 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949131 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949156 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949185 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 27 15:47:31 crc 
kubenswrapper[4713]: I0127 15:47:31.949215 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949243 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949272 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949300 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949341 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949371 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949401 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949430 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949473 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949499 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949528 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 
27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949562 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949586 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949613 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949642 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949674 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949704 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" 
(UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949732 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949759 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949787 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949813 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949844 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" 
(UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949870 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949902 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949928 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949957 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949984 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950010 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950058 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950090 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950120 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950148 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950176 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 27 15:47:31 
crc kubenswrapper[4713]: I0127 15:47:31.950233 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950265 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950294 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950386 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod 
\"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950415 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950445 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950533 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" 
(UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950566 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950591 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950726 4713 reconciler_common.go:293] 
"Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950738 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950748 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950759 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950772 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950782 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950792 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950802 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: 
\"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950812 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.951196 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964992 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.966752 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945564 4713 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945584 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945583 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945908 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945949 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.945980 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946513 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946545 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946542 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946618 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946647 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946734 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.946799 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947128 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947683 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947681 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947824 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947854 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947946 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.947979 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948004 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948060 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948491 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948780 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.948852 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949026 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949250 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949284 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949295 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949519 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.967069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.967159 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.967195 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949910 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949928 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.949981 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.950674 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.951010 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.951132 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.951544 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.951560 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.951785 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.952531 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.952543 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.952910 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.953126 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.952565 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.953397 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.953879 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.954364 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.954619 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.954509 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.954789 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.954853 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.954894 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955102 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955210 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955341 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955630 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955722 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955778 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955830 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955845 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.955597 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.956194 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.956514 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.956567 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.956658 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.956981 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.957811 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958024 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958055 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958124 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958204 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958436 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958474 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958561 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958695 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.958761 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959068 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959160 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959228 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959426 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959467 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959680 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959730 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959765 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.959821 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.960120 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.960492 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.960767 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961051 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961072 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.960926 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961184 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961243 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961300 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.961537 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:47:32.4614734 +0000 UTC m=+20.239683328 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.968115 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961663 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961666 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961872 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961885 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961948 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.961964 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962162 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962207 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962302 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962387 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962496 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962515 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962573 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962664 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962725 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.962878 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963079 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963108 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963117 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963379 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963396 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963633 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963651 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963670 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963822 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963976 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.963997 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964185 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964205 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964247 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964418 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.964652 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.968449 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964670 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.964940 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965354 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965384 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965444 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965546 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965582 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965771 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.965793 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.966110 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.966123 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.966253 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.966360 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.966761 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.968273 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:32.468245086 +0000 UTC m=+20.246455024 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.969492 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:32.469467744 +0000 UTC m=+20.247677672 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.970100 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.970533 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.970985 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.971176 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.974170 4713 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.975496 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.993947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.994163 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.994860 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995028 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995071 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995110 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995211 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:32.495160278 +0000 UTC m=+20.273370436 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995388 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995407 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995438 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:31 crc kubenswrapper[4713]: E0127 15:47:31.995472 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:32.495463367 +0000 UTC m=+20.273673395 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.995615 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:31 crc kubenswrapper[4713]: I0127 15:47:31.997182 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.000282 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002011 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002179 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002247 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002188 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002665 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002664 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.002689 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.003257 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.003910 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.003984 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.004227 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.004692 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.004968 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.005077 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.005329 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.005494 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.005512 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.005736 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.010555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.010771 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.013234 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.014430 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.014463 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.014641 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.014708 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.015198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.015201 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.015435 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.015662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.015469 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.015902 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.018792 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.020543 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46" exitCode=255 Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.020624 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46"} Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.031296 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.032835 4713 scope.go:117] "RemoveContainer" containerID="d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.036556 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.040945 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" 
(UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.045732 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.048018 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.050137 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.051836 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.051898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.051947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.051970 4713 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.051985 4713 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.051997 4713 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath 
\"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052009 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052022 4713 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052053 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052068 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052081 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052092 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052103 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052115 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052128 4713 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052140 4713 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052151 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052162 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052172 4713 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") 
on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052183 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052194 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052205 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052215 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052226 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052237 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052248 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052261 4713 reconciler_common.go:293] 
"Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052272 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052283 4713 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052295 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052307 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052318 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052328 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052337 4713 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052345 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052353 4713 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052362 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052370 4713 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052377 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052386 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052395 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: 
\"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052404 4713 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052412 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052420 4713 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052428 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052436 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052444 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052452 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052459 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052467 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052475 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052483 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052491 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052498 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052506 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 
15:47:32.052513 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052522 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052529 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052538 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052546 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.052730 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.053930 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054384 4713 reconciler_common.go:293] "Volume detached for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054415 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054482 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054553 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054574 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054605 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054619 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054636 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054646 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054656 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054666 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054682 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054693 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054703 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054718 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 
15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054728 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054756 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054841 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054860 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054875 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054892 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054909 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 
15:47:32.054925 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054934 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054945 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054958 4713 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054971 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.054986 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055000 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055171 4713 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055183 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055242 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055361 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055810 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055836 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055932 4713 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.055992 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath 
\"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056024 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056192 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056214 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056306 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056414 4713 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056604 4713 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056630 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056673 4713 
reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056795 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056813 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056823 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056839 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056849 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056858 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056871 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: 
\"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056880 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056890 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056901 4713 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056916 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056927 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056937 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056949 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 27 
15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056962 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056972 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.056982 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057012 4713 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057027 4713 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057103 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057190 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057450 4713 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057490 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057504 4713 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057514 4713 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057529 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057539 4713 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057552 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057565 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057574 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057585 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057594 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057607 4713 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057618 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057629 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057638 4713 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057650 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057659 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057669 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057679 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057689 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057698 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057709 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057722 
4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057731 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057740 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057749 4713 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057763 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057772 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057783 4713 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057793 4713 reconciler_common.go:293] "Volume 
detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057805 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057816 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057825 4713 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057837 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057847 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057855 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057890 4713 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057904 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057914 4713 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057894 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.057925 4713 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058027 4713 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 27 
15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058064 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058077 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058089 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058106 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058118 4713 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058130 4713 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058143 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058157 4713 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058170 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058180 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058194 4713 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058210 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058223 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058234 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058245 4713 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058256 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058266 4713 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058275 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.058286 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.069052 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.082199 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.106775 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.157133 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.165499 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.171525 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 27 15:47:32 crc kubenswrapper[4713]: W0127 15:47:32.177732 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-b8e3b5022be0d4020f68eab3cd33fff081f20deb699393991f9ca60ebe4aa70f WatchSource:0}: Error finding container b8e3b5022be0d4020f68eab3cd33fff081f20deb699393991f9ca60ebe4aa70f: Status 404 returned error can't find the container with id b8e3b5022be0d4020f68eab3cd33fff081f20deb699393991f9ca60ebe4aa70f Jan 27 15:47:32 crc kubenswrapper[4713]: W0127 15:47:32.179455 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-699f9f29e87c1eb4f637a67f2aac72b70d1651b884ffaf2763b418ab253cd50e WatchSource:0}: Error finding container 699f9f29e87c1eb4f637a67f2aac72b70d1651b884ffaf2763b418ab253cd50e: Status 404 returned error can't find the container with id 699f9f29e87c1eb4f637a67f2aac72b70d1651b884ffaf2763b418ab253cd50e Jan 27 15:47:32 crc kubenswrapper[4713]: W0127 15:47:32.189321 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-cbac7d1c11e516bc5ef047733714c8f824bbf39363f4ce8baf41bf15d6661e99 WatchSource:0}: Error finding container cbac7d1c11e516bc5ef047733714c8f824bbf39363f4ce8baf41bf15d6661e99: Status 404 returned error can't find the container with id cbac7d1c11e516bc5ef047733714c8f824bbf39363f4ce8baf41bf15d6661e99 Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.461757 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") 
pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.461933 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:47:33.461905519 +0000 UTC m=+21.240115457 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.563269 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.563351 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.563409 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.563437 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563480 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563511 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563526 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563536 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563635 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:33.563613492 +0000 UTC m=+21.341823430 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563658 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:33.563649373 +0000 UTC m=+21.341859311 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563537 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563733 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563745 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563546 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563792 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:33.563768827 +0000 UTC m=+21.341978825 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:32 crc kubenswrapper[4713]: E0127 15:47:32.563934 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:33.563887981 +0000 UTC m=+21.342097969 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.854897 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 11:02:40.277057017 +0000 UTC Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.903179 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.904089 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.904858 4713 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.905710 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.906530 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.907064 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.907651 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.908221 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.908940 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.909479 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.910053 4713 kubelet_volumes.go:163] 
"Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.910729 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.911297 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.911845 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.914363 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.915101 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.915915 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.916904 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.917331 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.917908 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.918530 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.919022 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.919826 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.920364 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.921023 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.921500 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.922128 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.922809 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.923330 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.923970 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.924585 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.925132 4713 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.925416 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.928590 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.928917 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.930422 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.931168 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.933094 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.933958 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.934879 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.935733 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.936656 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.937371 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.938218 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.939089 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.939849 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.940490 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.942066 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.943420 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.945208 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.945526 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.946666 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.947324 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.947956 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.948741 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.949615 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.950875 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.960130 4713 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.974136 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:32 crc kubenswrapper[4713]: I0127 15:47:32.988867 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.004817 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.028544 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 
15:47:33.028610 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.028622 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"699f9f29e87c1eb4f637a67f2aac72b70d1651b884ffaf2763b418ab253cd50e"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.031071 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.031137 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"b8e3b5022be0d4020f68eab3cd33fff081f20deb699393991f9ca60ebe4aa70f"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.034321 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.036558 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.037326 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.038578 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"cbac7d1c11e516bc5ef047733714c8f824bbf39363f4ce8baf41bf15d6661e99"} Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.043129 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.057637 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.076031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp 
[::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.090284 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.117948 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.148869 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.163808 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.181902 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.194785 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.207994 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.223609 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.237908 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.250358 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.270611 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.471742 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.472003 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:47:35.47196369 +0000 UTC m=+23.250173628 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.573303 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.573361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.573382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.573401 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: 
\"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573569 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573588 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573602 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573622 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573679 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:35.573656593 +0000 UTC m=+23.351866531 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573683 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573702 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573729 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:35.573722055 +0000 UTC m=+23.351931993 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573641 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573793 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:35.573787847 +0000 UTC m=+23.351997785 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573843 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.573880 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-01-27 15:47:35.573872019 +0000 UTC m=+23.352081957 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.855783 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 01:57:28.680654127 +0000 UTC Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.899390 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.899424 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:33 crc kubenswrapper[4713]: I0127 15:47:33.899463 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.899555 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.899775 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:33 crc kubenswrapper[4713]: E0127 15:47:33.899698 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.054792 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.059278 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.065701 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.071025 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.085172 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.099622 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.113619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.127968 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.141717 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.157445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.171960 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.185745 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.201477 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.213351 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.228510 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.242215 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.256370 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.271005 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:34Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:34 crc kubenswrapper[4713]: I0127 15:47:34.855937 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 14:40:23.651286997 +0000 UTC Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.046413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5"} Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.061897 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.076994 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.096605 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.126100 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.145642 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.157415 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.171286 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.186242 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:35Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.493981 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.494301 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:47:39.494249687 +0000 UTC m=+27.272459665 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.595500 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.595555 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.595588 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.595627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595737 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595808 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595853 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595872 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595872 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:39.595843856 +0000 UTC m=+27.374053804 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595747 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595923 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595941 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595946 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:39.595925149 +0000 UTC m=+27.374135087 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595972 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:39.5959649 +0000 UTC m=+27.374174958 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.595763 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.596007 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:39.596001451 +0000 UTC m=+27.374211499 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.747507 4713 csr.go:261] certificate signing request csr-sqwt7 is approved, waiting to be issued Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.793657 4713 csr.go:257] certificate signing request csr-sqwt7 is issued Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.856690 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 04:50:17.627791147 +0000 UTC Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.898657 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.898653 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:35 crc kubenswrapper[4713]: I0127 15:47:35.898669 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.899002 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.898993 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:35 crc kubenswrapper[4713]: E0127 15:47:35.898780 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.156210 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.175110 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.175332 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 
15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.191379 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.206889 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.210227 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.238163 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.255421 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.270849 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.286815 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.319247 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.358508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.398584 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.432844 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.457932 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.477003 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:
14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.498037 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.517687 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.537442 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.552684 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.621647 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-n7wxq"] Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.622132 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.623770 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.624601 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-j5sgs"] Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.625925 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.626409 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-6h5wz"] Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.626666 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.627196 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.627456 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.631593 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.632188 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.633290 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.633446 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.634111 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.636585 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.636787 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.639546 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.639966 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.640188 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.649147 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.661291 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.674322 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.687730 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.704789 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-socket-dir-parent\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.704833 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9bd62e63-357b-4f16-a2a1-e6a1d2375808-rootfs\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.704958 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-k8s-cni-cncf-io\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc 
kubenswrapper[4713]: I0127 15:47:36.705013 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-cni-multus\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705076 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-conf-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705102 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf07c585-f90e-4416-a66c-d41547008320-multus-daemon-config\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705154 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7dab7b05-4eca-4f3d-b53d-ad1e0042cd55-hosts-file\") pod \"node-resolver-j5sgs\" (UID: \"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\") " pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705181 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-system-cni-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705202 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-netns\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705235 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-86thf\" (UniqueName: \"kubernetes.io/projected/bf07c585-f90e-4416-a66c-d41547008320-kube-api-access-86thf\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705282 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-os-release\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705320 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf07c585-f90e-4416-a66c-d41547008320-cni-binary-copy\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705344 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bd62e63-357b-4f16-a2a1-e6a1d2375808-mcd-auth-proxy-config\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705367 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8qv\" (UniqueName: \"kubernetes.io/projected/9bd62e63-357b-4f16-a2a1-e6a1d2375808-kube-api-access-vj8qv\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705406 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tp9nt\" (UniqueName: \"kubernetes.io/projected/7dab7b05-4eca-4f3d-b53d-ad1e0042cd55-kube-api-access-tp9nt\") pod \"node-resolver-j5sgs\" (UID: \"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\") " pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705426 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-cni-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705446 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-cnibin\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705469 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-kubelet\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705494 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-hostroot\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-cni-bin\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705548 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-multus-certs\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705568 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-etc-kubernetes\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.705590 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bd62e63-357b-4f16-a2a1-e6a1d2375808-proxy-tls\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.709065 4713 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45
d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a007631
24585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-d
ir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.723377 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c0
7372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started
\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.737461 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.750204 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.762243 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.773254 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.785783 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.795555 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-27 15:42:35 +0000 UTC, rotation deadline is 2026-11-10 02:07:06.137627708 +0000 UTC Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.795593 4713 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6874h19m29.342038193s for next certificate rotation Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.798875 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807017 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-k8s-cni-cncf-io\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " 
pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-cni-multus\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807165 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-conf-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807180 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-k8s-cni-cncf-io\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807208 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf07c585-f90e-4416-a66c-d41547008320-multus-daemon-config\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807259 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-cni-multus\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807334 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7dab7b05-4eca-4f3d-b53d-ad1e0042cd55-hosts-file\") pod \"node-resolver-j5sgs\" (UID: \"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\") " pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807283 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7dab7b05-4eca-4f3d-b53d-ad1e0042cd55-hosts-file\") pod \"node-resolver-j5sgs\" (UID: \"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\") " pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807366 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-system-cni-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-netns\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807371 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-conf-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807422 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86thf\" (UniqueName: 
\"kubernetes.io/projected/bf07c585-f90e-4416-a66c-d41547008320-kube-api-access-86thf\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807452 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-os-release\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807455 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-system-cni-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807462 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-netns\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807468 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf07c585-f90e-4416-a66c-d41547008320-cni-binary-copy\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807552 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bd62e63-357b-4f16-a2a1-e6a1d2375808-mcd-auth-proxy-config\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " 
pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807582 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8qv\" (UniqueName: \"kubernetes.io/projected/9bd62e63-357b-4f16-a2a1-e6a1d2375808-kube-api-access-vj8qv\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807610 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-cnibin\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807637 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-kubelet\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807661 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-hostroot\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807696 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tp9nt\" (UniqueName: \"kubernetes.io/projected/7dab7b05-4eca-4f3d-b53d-ad1e0042cd55-kube-api-access-tp9nt\") pod \"node-resolver-j5sgs\" (UID: \"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\") " pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: 
I0127 15:47:36.807719 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-cni-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807741 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-etc-kubernetes\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807763 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bd62e63-357b-4f16-a2a1-e6a1d2375808-proxy-tls\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-cni-bin\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-multus-certs\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-socket-dir-parent\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807864 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9bd62e63-357b-4f16-a2a1-e6a1d2375808-rootfs\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-kubelet\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807863 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-os-release\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807912 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-cnibin\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807937 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-var-lib-cni-bin\") pod \"multus-n7wxq\" (UID: 
\"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807954 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-host-run-multus-certs\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807968 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-socket-dir-parent\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807974 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-etc-kubernetes\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807996 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/9bd62e63-357b-4f16-a2a1-e6a1d2375808-rootfs\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.807959 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-hostroot\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.808187 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bf07c585-f90e-4416-a66c-d41547008320-multus-cni-dir\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.808436 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bd62e63-357b-4f16-a2a1-e6a1d2375808-mcd-auth-proxy-config\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.808484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bf07c585-f90e-4416-a66c-d41547008320-cni-binary-copy\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.808845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bf07c585-f90e-4416-a66c-d41547008320-multus-daemon-config\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.816554 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.819950 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bd62e63-357b-4f16-a2a1-e6a1d2375808-proxy-tls\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.829983 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86thf\" (UniqueName: \"kubernetes.io/projected/bf07c585-f90e-4416-a66c-d41547008320-kube-api-access-86thf\") pod \"multus-n7wxq\" (UID: \"bf07c585-f90e-4416-a66c-d41547008320\") " pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.833833 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tp9nt\" 
(UniqueName: \"kubernetes.io/projected/7dab7b05-4eca-4f3d-b53d-ad1e0042cd55-kube-api-access-tp9nt\") pod \"node-resolver-j5sgs\" (UID: \"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\") " pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.839329 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.840791 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8qv\" (UniqueName: \"kubernetes.io/projected/9bd62e63-357b-4f16-a2a1-e6a1d2375808-kube-api-access-vj8qv\") pod \"machine-config-daemon-6h5wz\" (UID: \"9bd62e63-357b-4f16-a2a1-e6a1d2375808\") " pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.857183 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:27:35.968272534 +0000 UTC Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.889694 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.925826 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.941092 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-n7wxq" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.947491 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-j5sgs" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.953316 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.955481 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e2
27b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.971710 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:36 crc kubenswrapper[4713]: W0127 15:47:36.972708 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bd62e63_357b_4f16_a2a1_e6a1d2375808.slice/crio-1198b64cacd13fa6aa3f51003843702857ead65a0d9c49fe7e802a987e90a243 WatchSource:0}: Error finding container 1198b64cacd13fa6aa3f51003843702857ead65a0d9c49fe7e802a987e90a243: Status 404 returned error can't find the container with id 1198b64cacd13fa6aa3f51003843702857ead65a0d9c49fe7e802a987e90a243 Jan 27 15:47:36 crc kubenswrapper[4713]: W0127 
15:47:36.975630 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dab7b05_4eca_4f3d_b53d_ad1e0042cd55.slice/crio-e3d394bd2febe90943e7ac9dc4e14b664197752bb5a144eb77f984ab1a1eea0f WatchSource:0}: Error finding container e3d394bd2febe90943e7ac9dc4e14b664197752bb5a144eb77f984ab1a1eea0f: Status 404 returned error can't find the container with id e3d394bd2febe90943e7ac9dc4e14b664197752bb5a144eb77f984ab1a1eea0f Jan 27 15:47:36 crc kubenswrapper[4713]: I0127 15:47:36.992284 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:36Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.006590 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-gb57w"] Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.008134 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.008694 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.010386 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.010503 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.028600 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.043336 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.053369 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerStarted","Data":"ed71e11fdfad845dfc89a979b4e49929bb7d8a0f716e168bc13d418adfa38c9d"} Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.055618 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j5sgs" 
event={"ID":"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55","Type":"ContainerStarted","Data":"e3d394bd2febe90943e7ac9dc4e14b664197752bb5a144eb77f984ab1a1eea0f"} Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.056705 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"1198b64cacd13fa6aa3f51003843702857ead65a0d9c49fe7e802a987e90a243"} Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.063323 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: E0127 15:47:37.065162 4713 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-crc\" already exists" pod="openshift-etcd/etcd-crc" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.076068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.088977 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.104475 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v
4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.110848 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-cnibin\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.110922 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " 
pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.111008 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqvr9\" (UniqueName: \"kubernetes.io/projected/9e1879ed-280f-4fea-982c-6203ba438008-kube-api-access-dqvr9\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.111067 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9e1879ed-280f-4fea-982c-6203ba438008-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.111100 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-os-release\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.111140 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-system-cni-dir\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.111176 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/9e1879ed-280f-4fea-982c-6203ba438008-cni-binary-copy\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.118162 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.134106 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.156837 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.171636 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.185866 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.199109 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.211992 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-cnibin\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212089 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212122 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqvr9\" (UniqueName: \"kubernetes.io/projected/9e1879ed-280f-4fea-982c-6203ba438008-kube-api-access-dqvr9\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212147 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9e1879ed-280f-4fea-982c-6203ba438008-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212179 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-cnibin\") pod 
\"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-os-release\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212245 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-system-cni-dir\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212285 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e1879ed-280f-4fea-982c-6203ba438008-cni-binary-copy\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212418 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-os-release\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212815 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-tuning-conf-dir\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212811 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/9e1879ed-280f-4fea-982c-6203ba438008-system-cni-dir\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.212966 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/9e1879ed-280f-4fea-982c-6203ba438008-cni-binary-copy\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.213700 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/9e1879ed-280f-4fea-982c-6203ba438008-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.214221 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.228591 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.230614 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqvr9\" (UniqueName: \"kubernetes.io/projected/9e1879ed-280f-4fea-982c-6203ba438008-kube-api-access-dqvr9\") pod \"multus-additional-cni-plugins-gb57w\" (UID: \"9e1879ed-280f-4fea-982c-6203ba438008\") " pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.242465 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.325638 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-gb57w" Jan 27 15:47:37 crc kubenswrapper[4713]: W0127 15:47:37.337272 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e1879ed_280f_4fea_982c_6203ba438008.slice/crio-a96ed023030ef83c33864878188f7b82f475ae79fc520f24fb4e282153f175f7 WatchSource:0}: Error finding container a96ed023030ef83c33864878188f7b82f475ae79fc520f24fb4e282153f175f7: Status 404 returned error can't find the container with id a96ed023030ef83c33864878188f7b82f475ae79fc520f24fb4e282153f175f7 Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.378783 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xs9tk"] Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.379734 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383270 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383306 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383329 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383491 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383659 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383762 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.383841 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.398868 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.414810 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.430700 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.450892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.479642 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.496312 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers 
with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.509156 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514540 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-node-log\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514584 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-env-overrides\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514612 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-var-lib-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514636 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-script-lib\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514694 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-log-socket\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514732 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-netd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514799 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-systemd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514841 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-kubelet\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514862 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-slash\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514880 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-ovn\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514900 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/131f5d56-4900-4558-abfa-24c9e999e5ad-ovn-node-metrics-cert\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514921 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-systemd-units\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.514971 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-bin\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515002 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-config\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515061 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515088 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5w9kd\" (UniqueName: \"kubernetes.io/projected/131f5d56-4900-4558-abfa-24c9e999e5ad-kube-api-access-5w9kd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515122 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-netns\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515148 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-etc-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.515170 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.530270 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.546520 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.560535 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.573266 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.586595 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e
227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.597102 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.608680 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:37Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-node-log\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616126 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-env-overrides\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616140 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-script-lib\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-var-lib-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616184 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-log-socket\") pod 
\"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616198 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-netd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616237 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-netd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616269 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-var-lib-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616292 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-node-log\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616276 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-log-socket\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" 
Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616329 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-systemd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616350 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/131f5d56-4900-4558-abfa-24c9e999e5ad-ovn-node-metrics-cert\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-kubelet\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616405 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-slash\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616425 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-ovn\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616449 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-config\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616453 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-kubelet\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-ovn\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-systemd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-systemd-units\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616469 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-systemd-units\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616639 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-slash\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-bin\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616687 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-bin\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616878 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616909 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5w9kd\" (UniqueName: \"kubernetes.io/projected/131f5d56-4900-4558-abfa-24c9e999e5ad-kube-api-access-5w9kd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616912 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616938 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-netns\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616958 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.616966 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-etc-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617001 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-etc-openvswitch\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617019 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-netns\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617025 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617007 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-ovn-kubernetes\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617388 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-env-overrides\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617539 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-config\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.617988 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-script-lib\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.619843 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/131f5d56-4900-4558-abfa-24c9e999e5ad-ovn-node-metrics-cert\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.632983 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5w9kd\" (UniqueName: \"kubernetes.io/projected/131f5d56-4900-4558-abfa-24c9e999e5ad-kube-api-access-5w9kd\") pod \"ovnkube-node-xs9tk\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.695697 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.858088 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 01:12:33.251551687 +0000 UTC Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.898566 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:37 crc kubenswrapper[4713]: E0127 15:47:37.898739 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.898566 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:37 crc kubenswrapper[4713]: E0127 15:47:37.899142 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:37 crc kubenswrapper[4713]: I0127 15:47:37.899178 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:37 crc kubenswrapper[4713]: E0127 15:47:37.899253 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.003281 4713 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.005095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.005146 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.005157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.005304 4713 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.014548 4713 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.014956 4713 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.016399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.016442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.016454 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.016474 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.016489 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: E0127 15:47:38.038361 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.043204 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.043253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.043265 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.043284 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.043295 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: E0127 15:47:38.058335 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.063063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.063113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.063125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.063145 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.063164 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.065386 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.065439 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.067942 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerStarted","Data":"d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.070525 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e1879ed-280f-4fea-982c-6203ba438008" containerID="7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d" exitCode=0 Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.070639 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerDied","Data":"7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.070746 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerStarted","Data":"a96ed023030ef83c33864878188f7b82f475ae79fc520f24fb4e282153f175f7"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.072456 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-j5sgs" event={"ID":"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55","Type":"ContainerStarted","Data":"f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.074859 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" exitCode=0 Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.074960 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.075028 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"e03bc59b04541a6766bd487d7d12b47cf4d7da2b78b65f45ff4c8b9f05c011d2"} Jan 27 15:47:38 crc kubenswrapper[4713]: E0127 15:47:38.080892 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.086422 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.086495 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.086510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.086536 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.086558 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.086713 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: E0127 15:47:38.103105 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.107339 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.109787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.109843 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.109858 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.109879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.109892 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.122486 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: E0127 15:47:38.123836 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: E0127 15:47:38.124155 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.126423 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.126561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.126653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.126746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.126828 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.160281 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.179171 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.195494 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.210095 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.229294 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.230656 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.230698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.230712 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.230731 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.230743 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.243587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.260417 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.284313 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fals
e,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvs
witch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.297566 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.309198 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.325599 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 
15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.335091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.335159 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.335171 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.335196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.335211 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.347514 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.361559 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.376865 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.390843 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.406099 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e
227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.416877 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.429026 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.438391 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.438445 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.438458 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.438480 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.438496 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.444474 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e
4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.457691 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.473951 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.489435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.509674 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.525057 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.544908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.544967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.544977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.544998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.545363 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.552693 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:38Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.648341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.648395 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.648408 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.648427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.648443 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.751504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.751846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.751855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.751871 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.751882 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.854223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.854270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.854281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.854301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.854320 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.858562 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:56:24.204479931 +0000 UTC Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.957683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.957742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.957757 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.957781 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:38 crc kubenswrapper[4713]: I0127 15:47:38.957798 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:38Z","lastTransitionTime":"2026-01-27T15:47:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.059972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.060011 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.060022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.060052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.060065 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.083160 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerStarted","Data":"78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.089400 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.089453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.089468 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.089480 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.089493 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} Jan 27 15:47:39 crc 
kubenswrapper[4713]: I0127 15:47:39.097860 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.113451 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.126437 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.140710 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.153518 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.163165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.163206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.163219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.163234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.163245 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.176612 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.190518 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.203412 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.217273 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.230313 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.244508 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.261802 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.265794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.265850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.265865 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.265885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 
15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.265899 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.280447 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.293622 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.368298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.368578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.368689 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.368776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.368861 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.471951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.472000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.472012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.472056 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.472071 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.535686 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.535880 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:47:47.53584083 +0000 UTC m=+35.314050768 (durationBeforeRetry 8s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.574922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.574982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.574993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.575012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.575023 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.601558 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-4l8xv"] Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.602015 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.604129 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.604238 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.604820 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.604853 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.626234 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.637777 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.637863 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.637897 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.637935 
4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.637954 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638099 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638109 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:47.638077359 +0000 UTC m=+35.416287367 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638126 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638143 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638099 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638202 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638207 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:47.638185703 +0000 UTC m=+35.416395641 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638214 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638235 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638252 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:47.638239224 +0000 UTC m=+35.416449252 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.638370 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:47.638348328 +0000 UTC m=+35.416558286 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.641888 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.654901 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.666482 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.677007 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.677636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.677688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.677703 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 
15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.677727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.677745 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.690132 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 
15:47:39.699029 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.709744 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d
793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.722198 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\
"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.734792 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.738540 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8bdd78db-909c-421c-a332-af38a8a6ba00-serviceca\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.738588 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bdd78db-909c-421c-a332-af38a8a6ba00-host\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.738607 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qltjf\" (UniqueName: \"kubernetes.io/projected/8bdd78db-909c-421c-a332-af38a8a6ba00-kube-api-access-qltjf\") pod 
\"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.751308 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\
\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.769425 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath
\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.779934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.779995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.780008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.780029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 
15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.780059 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.787341 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.801235 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.814648 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.839516 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8bdd78db-909c-421c-a332-af38a8a6ba00-serviceca\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.839658 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/8bdd78db-909c-421c-a332-af38a8a6ba00-host\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.839683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qltjf\" (UniqueName: \"kubernetes.io/projected/8bdd78db-909c-421c-a332-af38a8a6ba00-kube-api-access-qltjf\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.839771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/8bdd78db-909c-421c-a332-af38a8a6ba00-host\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.840792 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/8bdd78db-909c-421c-a332-af38a8a6ba00-serviceca\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.859074 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 19:31:15.938412401 +0000 UTC Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.862635 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qltjf\" (UniqueName: \"kubernetes.io/projected/8bdd78db-909c-421c-a332-af38a8a6ba00-kube-api-access-qltjf\") pod \"node-ca-4l8xv\" (UID: \"8bdd78db-909c-421c-a332-af38a8a6ba00\") " pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.882884 4713 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.882946 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.882963 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.882992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.883014 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.899225 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.899225 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.899393 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.899250 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.899600 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:39 crc kubenswrapper[4713]: E0127 15:47:39.899593 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.915181 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-4l8xv" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.985325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.985367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.985380 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.985398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:39 crc kubenswrapper[4713]: I0127 15:47:39.985412 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:39Z","lastTransitionTime":"2026-01-27T15:47:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.087614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.087656 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.087668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.087686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.087700 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.095158 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.095989 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4l8xv" event={"ID":"8bdd78db-909c-421c-a332-af38a8a6ba00","Type":"ContainerStarted","Data":"109e9efc4e754afed2bee94ea896a581ef2f8bbeb753b4ca36e971f3f5b88869"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.097336 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e1879ed-280f-4fea-982c-6203ba438008" containerID="78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b" exitCode=0 Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.097375 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerDied","Data":"78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.114124 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.126521 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.139435 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.152088 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.167416 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.183676 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.192629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:40 crc 
kubenswrapper[4713]: I0127 15:47:40.192669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.192681 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.192698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.192710 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.203619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-a
llowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.222665 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.237673 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.253190 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.276632 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.290722 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.296792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.296833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.296842 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.296861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.296873 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.328120 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.368120 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.399905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.399955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.399967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.399992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.400004 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.410014 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:40Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.503268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.503312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.503323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 
15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.503342 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.503355 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.605612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.605663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.605672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.605691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.605701 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.708295 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.708360 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.708384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.708410 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.708426 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.811929 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.811992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.812007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.812028 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.812062 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.859952 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 01:24:36.476432607 +0000 UTC
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.915881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.915934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.915943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.915961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:40 crc kubenswrapper[4713]: I0127 15:47:40.915972 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:40Z","lastTransitionTime":"2026-01-27T15:47:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.019151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.019187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.019195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.019212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.019222 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.103760 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e1879ed-280f-4fea-982c-6203ba438008" containerID="8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d" exitCode=0
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.103834 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerDied","Data":"8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d"}
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.105863 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-4l8xv" event={"ID":"8bdd78db-909c-421c-a332-af38a8a6ba00","Type":"ContainerStarted","Data":"39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d"}
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.121349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.121399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.121414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.121433 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.121445 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.136951 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f
42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90
092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finis
hedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.153509 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.171159 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.183960 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.197544 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.212652 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.223790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.223827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.223840 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.223858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.223872 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.227410 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.239021 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.250471 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.264864 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.277749 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.289362 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 
15:47:41.305425 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.319368 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.325942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.325964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.325973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.326004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.326015 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.332817 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.352318 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.365577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.376222 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.385893 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.395996 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.407965 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.418704 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.428704 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.428756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.428774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.428795 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.428809 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.433639 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.447183 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.458861 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.470932 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.488906 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 
15:47:41.531667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.531710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.531718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.531736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.531748 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.532189 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.578695 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.610542 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:41Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.634993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.635057 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.635073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 
15:47:41.635092 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.635106 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.738161 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.738214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.738226 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.738245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.738259 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.840931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.840980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.840992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.841012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.841026 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.860581 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 16:28:31.814444441 +0000 UTC Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.899169 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.899218 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.899301 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:41 crc kubenswrapper[4713]: E0127 15:47:41.899336 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:41 crc kubenswrapper[4713]: E0127 15:47:41.899409 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:41 crc kubenswrapper[4713]: E0127 15:47:41.899607 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.944550 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.944598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.944607 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.944625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:41 crc kubenswrapper[4713]: I0127 15:47:41.944638 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:41Z","lastTransitionTime":"2026-01-27T15:47:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.047693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.047737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.047747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.047764 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.047774 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.113079 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e1879ed-280f-4fea-982c-6203ba438008" containerID="1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee" exitCode=0 Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.113155 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerDied","Data":"1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.120229 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.143506 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.151153 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.151189 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.151197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.151213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.151223 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.159937 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.176907 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.191896 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.204063 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.218335 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.232395 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.244981 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.254061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.254102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.254114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.254132 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.254143 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.261031 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e
4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.276571 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.291591 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.308488 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.331886 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.346914 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.357080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.357123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.357132 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.357148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.357159 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.363073 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.461574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.462199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.462234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.462260 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.462271 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.565201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.565249 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.565259 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.565276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.565291 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.667808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.667850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.667858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.667878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.667888 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.695526 4713 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.770030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.770091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.770102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.770121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.770134 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.861147 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 22:00:04.487251633 +0000 UTC Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.871928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.871987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.872004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.872029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.872096 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.913799 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e
4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.931817 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.948154 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.963996 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.974274 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.974309 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.974317 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.974331 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.974342 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:42Z","lastTransitionTime":"2026-01-27T15:47:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:42 crc kubenswrapper[4713]: I0127 15:47:42.984976 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.005286 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.021399 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.043720 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.057865 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.059154 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.069927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.077162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 
15:47:43.077193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.077201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.077218 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.077228 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.085219 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.096062 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.108544 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.120289 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.126880 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e1879ed-280f-4fea-982c-6203ba438008" containerID="f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da" exitCode=0 Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.127012 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerDied","Data":"f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.137578 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.153200 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.168502 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.179868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.179913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.179922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc 
kubenswrapper[4713]: I0127 15:47:43.179949 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.179961 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.184791 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413b
dcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Dis
abled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.202362 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac04
9b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.221603 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.237131 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.252095 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.275418 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.282752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.282791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.282801 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.282818 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.282828 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.290458 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.304312 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.320068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.331566 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.345577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.367132 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.385439 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.385796 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.385880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.385984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.386096 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.407653 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.488923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.488984 4713 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.488998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.489024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.489068 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.592361 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.592422 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.592434 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.592456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.592471 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.695895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.695944 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.695957 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.695976 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.695989 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.799291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.799346 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.799363 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.799386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.799401 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.861494 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 12:29:46.988807627 +0000 UTC Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.899336 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.899395 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:43 crc kubenswrapper[4713]: E0127 15:47:43.899571 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.899632 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:43 crc kubenswrapper[4713]: E0127 15:47:43.899757 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:43 crc kubenswrapper[4713]: E0127 15:47:43.899869 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.902111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.902151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.902161 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.902180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:43 crc kubenswrapper[4713]: I0127 15:47:43.902235 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:43Z","lastTransitionTime":"2026-01-27T15:47:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.005238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.005292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.005301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.005327 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.005351 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.108003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.108065 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.108078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.108096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.108109 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.138387 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.138728 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.138765 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.144564 4713 generic.go:334] "Generic (PLEG): container finished" podID="9e1879ed-280f-4fea-982c-6203ba438008" containerID="9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba" exitCode=0 Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.144644 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerDied","Data":"9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.156220 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.168856 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.183281 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.196868 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.200115 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.200376 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.210824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.210871 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 
15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.210884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.210903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.210915 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.216015 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.234541 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.251713 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.277968 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d
1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"
},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800
ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mount
Path\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.293112 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apise
rver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly 
requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.307608 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.313961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.314007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.314021 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 
15:47:44.314059 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.314070 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.328736 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.346629 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.362589 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.376868 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.388779 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.402835 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.414545 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.416185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.416233 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.416243 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 
15:47:44.416262 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.416273 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.433305 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-
dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ec
d6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.448108 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.463874 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.479174 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.492262 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.507743 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.519357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.519788 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.519798 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.519817 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.519828 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.520586 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.534921 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.548915 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.563258 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.579586 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.594867 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.614327 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.622524 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.622569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.622579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.622598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.622615 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.725437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.725475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.725484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.725499 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.725525 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.828462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.828511 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.828521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.828541 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.828552 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.862506 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 00:00:35.154160404 +0000 UTC Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.931549 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.931593 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.931602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.931618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:44 crc kubenswrapper[4713]: I0127 15:47:44.931628 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:44Z","lastTransitionTime":"2026-01-27T15:47:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.035009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.035092 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.035104 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.035123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.035135 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.138106 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.138143 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.138153 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.138170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.138181 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.152721 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" event={"ID":"9e1879ed-280f-4fea-982c-6203ba438008","Type":"ContainerStarted","Data":"5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.152812 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.173549 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"init
ContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.185621 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.197713 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.210023 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.221634 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.237267 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.241428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.241485 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.241499 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.241522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.241535 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.249579 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.265241 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.280344 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.295375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.315251 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.335825 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.343615 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.343669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.343684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.343706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.343720 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.358002 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.373804 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"res
ource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T1
5:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.390352 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.446842 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.446885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.446898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.446948 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.446983 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.550668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.550715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.550725 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.550740 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.550751 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.653609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.653697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.653716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.653747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.653771 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.755627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.755675 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.755686 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.755704 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.755714 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.858280 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.858326 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.858337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.858353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.858362 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.863570 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 22:19:47.419225799 +0000 UTC Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.898465 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.898572 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:45 crc kubenswrapper[4713]: E0127 15:47:45.898624 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:45 crc kubenswrapper[4713]: E0127 15:47:45.898746 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.898901 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:45 crc kubenswrapper[4713]: E0127 15:47:45.899000 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.961176 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.961214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.961224 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.961240 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:45 crc kubenswrapper[4713]: I0127 15:47:45.961250 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:45Z","lastTransitionTime":"2026-01-27T15:47:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.067608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.067683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.067693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.067725 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.067738 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.155887 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.170735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.170791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.170804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.170824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.170838 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.273995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.274332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.274438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.274527 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.274589 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.377558 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.377861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.377925 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.377993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.378076 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.482067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.482423 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.482487 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.482569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.482643 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.586180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.586548 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.586688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.586774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.586849 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.689828 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.689875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.689885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.689904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.689914 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.792791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.792838 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.792851 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.792874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.792888 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.863808 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 09:39:07.919540834 +0000 UTC Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.895785 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.895822 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.895831 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.895846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.895857 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.998594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.998638 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.998677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.998702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:46 crc kubenswrapper[4713]: I0127 15:47:46.998715 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:46Z","lastTransitionTime":"2026-01-27T15:47:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.102385 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.102437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.102446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.102464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.102475 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.162080 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/0.log" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.165787 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b" exitCode=1 Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.165837 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.166574 4713 scope.go:117] "RemoveContainer" containerID="b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.186432 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.203433 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.205442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.205490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.205498 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.205516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.205527 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.217606 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.231251 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.244297 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.257251 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.272617 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.287022 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.305192 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.308083 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.308149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.308166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.308191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 
15:47:47.308213 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.321914 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.336018 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.355881 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.382942 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 
15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.397618 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.412600 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.413101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.413122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.413140 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.413151 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.414610 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:47Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.515982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.516024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.516054 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.516072 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.516083 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.617832 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.618160 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:48:03.618118199 +0000 UTC m=+51.396328177 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.619114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.619156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.619166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.619183 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.619195 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.719164 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.719216 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.719247 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.719269 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719395 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719451 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:03.719437131 +0000 UTC m=+51.497647069 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719488 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719543 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719560 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719602 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719654 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:03.719624307 +0000 UTC m=+51.497834245 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719698 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:03.719686069 +0000 UTC m=+51.497896227 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719753 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719769 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719780 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.719824 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:03.719813793 +0000 UTC m=+51.498023941 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.721373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.721428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.721450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.721470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.721483 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.824106 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.824154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.824163 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.824180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.824191 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.864013 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 09:46:17.536297906 +0000 UTC Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.898578 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.898629 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.898657 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.898748 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.898834 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:47 crc kubenswrapper[4713]: E0127 15:47:47.898920 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.927743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.927792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.927802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.927826 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:47 crc kubenswrapper[4713]: I0127 15:47:47.927837 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:47Z","lastTransitionTime":"2026-01-27T15:47:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.031335 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.031390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.031409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.031431 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.031444 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.134893 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.134953 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.134967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.134986 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.134999 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.170561 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/0.log" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.173369 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.173543 4713 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.201256 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb6
8e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee78
66be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b
30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.217562 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.235026 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.237636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.237677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.237690 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.237709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.237724 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.253300 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.267249 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.283254 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.296342 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.301567 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.301625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.301637 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.301658 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.301671 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.312407 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: E0127 15:47:48.314073 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.318012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.318077 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.318091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.318111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.318125 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.328635 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e
4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: E0127 15:47:48.331562 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.337230 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.337281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.337294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.337313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.337328 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.341911 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: E0127 15:47:48.353266 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.356967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.357022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.357058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.357084 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.357098 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.357626 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z 
is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: E0127 15:47:48.370783 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.373347 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCou
nt\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\
\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742f
d0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mou
ntPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.375293 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.375340 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.375355 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.375377 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.375391 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: E0127 15:47:48.387118 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: E0127 15:47:48.387251 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.389322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.389367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.389381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.389397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.389409 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.391353 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.407719 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.424693 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.492529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.492573 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.492583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 
15:47:48.492599 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.492610 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.595806 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.596116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.596198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.596277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.596347 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.700105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.700152 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.700164 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.700186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.700201 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.804562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.804618 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.804628 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.804649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.804669 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.864600 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 01:32:21.056602581 +0000 UTC Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.904756 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm"] Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.906233 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.907329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.907362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.907373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.907390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.907403 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:48Z","lastTransitionTime":"2026-01-27T15:47:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.909408 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.909567 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.929875 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\
\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763
124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a673147
31ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.950176 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.964341 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.982018 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:48 crc kubenswrapper[4713]: I0127 15:47:48.997866 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:48Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.010432 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.010502 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.010512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.010529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.010542 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.013282 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.024969 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.033781 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9p5cj\" (UniqueName: \"kubernetes.io/projected/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-kube-api-access-9p5cj\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.033847 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.033881 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.033918 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.038839 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":tru
e,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc 
kubenswrapper[4713]: I0127 15:47:49.050947 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.065355 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io
/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.081365 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.098005 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.113221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc 
kubenswrapper[4713]: I0127 15:47:49.113275 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.113288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.113310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.113327 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.114475 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4
d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.135311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9p5cj\" (UniqueName: \"kubernetes.io/projected/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-kube-api-access-9p5cj\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.135356 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.135384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.135402 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.136252 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.136511 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.139223 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 
1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.144084 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.155530 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9p5cj\" (UniqueName: \"kubernetes.io/projected/6017cf1d-2f04-45b3-bc32-4a1cae590bc5-kube-api-access-9p5cj\") pod \"ovnkube-control-plane-749d76644c-cqmfm\" (UID: \"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.157407 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apise
rver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36
dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 
15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc3
5825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.174577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.179521 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/1.log" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.180440 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/0.log" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.183729 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373" exitCode=1 Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.183841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.183944 4713 scope.go:117] "RemoveContainer" containerID="b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.184834 4713 scope.go:117] "RemoveContainer" containerID="1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373" Jan 27 15:47:49 crc kubenswrapper[4713]: E0127 15:47:49.185055 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.202001 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.217125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.217181 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.217193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.217212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.217230 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.219282 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.219409 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\
\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.237557 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"r
eady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed61
63a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256
:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\
\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.259458 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace 
event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for 
*v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: 
de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.273880 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"i
p\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.289631 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da4
10c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.303875 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.320094 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.320910 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.320965 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.320982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.321003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.321015 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.336387 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.351141 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.367768 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.381475 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.401196 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.415005 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount
\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.424284 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.424329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.424337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.424357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.424367 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.427405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.440375 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:49Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.527285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.527338 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.527350 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.527370 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.527384 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.630736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.630780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.630791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.630808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.630820 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.733973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.734025 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.734067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.734087 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.734103 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.837497 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.837552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.837566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.837587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.837602 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.865292 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 19:08:23.472003196 +0000 UTC Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.899059 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.899111 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.899058 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:49 crc kubenswrapper[4713]: E0127 15:47:49.899236 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:49 crc kubenswrapper[4713]: E0127 15:47:49.899428 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:49 crc kubenswrapper[4713]: E0127 15:47:49.899497 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.940238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.940288 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.940298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.940313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:49 crc kubenswrapper[4713]: I0127 15:47:49.940324 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:49Z","lastTransitionTime":"2026-01-27T15:47:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.042831 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.042884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.042897 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.042924 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.042937 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.146812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.146868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.146879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.146895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.146907 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.191490 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/1.log" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.197493 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" event={"ID":"6017cf1d-2f04-45b3-bc32-4a1cae590bc5","Type":"ContainerStarted","Data":"cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.197554 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" event={"ID":"6017cf1d-2f04-45b3-bc32-4a1cae590bc5","Type":"ContainerStarted","Data":"ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.197566 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" event={"ID":"6017cf1d-2f04-45b3-bc32-4a1cae590bc5","Type":"ContainerStarted","Data":"5228f1c1803d3da4c1cff4ac386b00e15ae4d8420027945e3987f8ff400fc7e2"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.212806 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.236428 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15
:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.250291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.250352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.250363 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.250385 4713 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.250401 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.257915 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"
2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.276068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.292171 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.306522 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-r
bac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.320208 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.331337 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.348985 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 
services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/e
tc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.352910 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.352979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.352988 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.353005 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.353017 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.363992 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.368318 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-mdw5k"] Jan 27 15:47:50 crc 
kubenswrapper[4713]: I0127 15:47:50.368851 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:50 crc kubenswrapper[4713]: E0127 15:47:50.368930 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.380249 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15
:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.394652 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.409122 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.426594 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.443248 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 
genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2ce
a21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.450409 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbjvc\" (UniqueName: \"kubernetes.io/projected/81e4c47d-af8d-44f0-beff-17cf5f133ff7-kube-api-access-vbjvc\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.450538 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.456191 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.456256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.456275 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.456301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.456321 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.456826 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.470697 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.484396 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.505302 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.521484 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.535825 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.548286 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.551848 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbjvc\" (UniqueName: \"kubernetes.io/projected/81e4c47d-af8d-44f0-beff-17cf5f133ff7-kube-api-access-vbjvc\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.551958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:50 crc kubenswrapper[4713]: E0127 15:47:50.552143 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:50 crc kubenswrapper[4713]: E0127 15:47:50.552220 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:51.052201648 +0000 UTC m=+38.830411586 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.564670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.564734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.564748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.564771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.564786 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.564958 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.577858 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbjvc\" (UniqueName: \"kubernetes.io/projected/81e4c47d-af8d-44f0-beff-17cf5f133ff7-kube-api-access-vbjvc\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.577931 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc 
kubenswrapper[4713]: I0127 15:47:50.594897 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.609262 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.622953 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.636998 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.654742 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.667947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.667993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.668003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.668022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.668048 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.668903 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.685379 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.704156 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 
services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/e
tc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.714724 4713 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:50Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.770384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.770438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.770450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.770469 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.770483 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.865771 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 18:06:59.413672214 +0000 UTC Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.873300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.873411 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.873428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.873451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.873465 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.976795 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.976878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.976894 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.976913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:50 crc kubenswrapper[4713]: I0127 15:47:50.976924 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:50Z","lastTransitionTime":"2026-01-27T15:47:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.057117 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:51 crc kubenswrapper[4713]: E0127 15:47:51.057325 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:51 crc kubenswrapper[4713]: E0127 15:47:51.057467 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:52.057437215 +0000 UTC m=+39.835647213 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.079811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.079867 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.079882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.079920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.079932 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.182812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.183379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.183394 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.183417 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.183432 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.285828 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.285869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.285881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.285898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.285910 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.389830 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.389898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.389919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.389950 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.389972 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.492602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.492650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.492659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.492677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.492694 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.594996 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.595058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.595072 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.595091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.595103 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.698172 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.698211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.698222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.698238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.698248 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.801726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.801766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.801776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.801792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.801804 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.866128 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 16:19:37.892832852 +0000 UTC Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.899227 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.899263 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.899313 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.899429 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:51 crc kubenswrapper[4713]: E0127 15:47:51.899600 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:47:51 crc kubenswrapper[4713]: E0127 15:47:51.899725 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:51 crc kubenswrapper[4713]: E0127 15:47:51.899888 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:51 crc kubenswrapper[4713]: E0127 15:47:51.900027 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.904390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.904430 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.904441 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.904461 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:51 crc kubenswrapper[4713]: I0127 15:47:51.904473 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:51Z","lastTransitionTime":"2026-01-27T15:47:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.007464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.007510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.007518 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.007533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.007544 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.068405 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:52 crc kubenswrapper[4713]: E0127 15:47:52.068690 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:52 crc kubenswrapper[4713]: E0127 15:47:52.068821 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:54.068784545 +0000 UTC m=+41.846994523 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.110897 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.110939 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.110947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.110962 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.110972 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.214464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.214535 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.214569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.214589 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.214600 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.318656 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.318723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.318736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.318759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.318778 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.422352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.422443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.422458 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.422481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.422494 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.525462 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.525520 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.525535 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.525554 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.525566 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.628516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.628577 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.628591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.628613 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.628629 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.732032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.732140 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.732166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.732197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.732219 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.834869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.834930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.834945 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.834966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.834979 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.866543 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-25 02:39:08.63765573 +0000 UTC Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.916228 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\
\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/ope
nshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] 
MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:52Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.929394 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:52Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.938694 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.938797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.938818 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.938851 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.938874 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:52Z","lastTransitionTime":"2026-01-27T15:47:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.942914 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"re
cursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:52Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.954654 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\
\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:52Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:52 crc kubenswrapper[4713]: I0127 15:47:52.972439 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:52Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:52 crc 
kubenswrapper[4713]: I0127 15:47:52.995272 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:52Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.016993 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.041576 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.041615 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.041623 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.041640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.041649 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.044842 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.068851 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.085716 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.099462 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"c
ontainerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.120563 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4
d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.143915 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b7ee51af25c81a5798660a0db7931f3165cf1765d21d841a81ec519ced02286b\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:46Z\\\",\\\"message\\\":\\\"er 4 for removal\\\\nI0127 15:47:46.851478 6032 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0127 15:47:46.851487 6032 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0127 15:47:46.851488 6032 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from 
sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0127 15:47:46.851517 6032 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0127 15:47:46.851519 6032 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0127 15:47:46.851533 6032 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0127 15:47:46.851566 6032 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0127 15:47:46.851981 6032 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0127 15:47:46.852000 6032 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0127 15:47:46.852023 6032 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0127 15:47:46.852060 6032 factory.go:656] Stopping watch factory\\\\nI0127 15:47:46.852078 6032 ovnkube.go:599] Stopped ovnkube\\\\nI0127 15:47:46.852117 6032 handler.go:208] Removed *v1.Node event handler 2\\\\nI0127 15:47:46.852130 6032 handler.go:208] Removed *v1.Node event handler 7\\\\nI0127 1\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 
services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/e
tc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secret
s/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.145736 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.145792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.145802 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.145822 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.145832 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.158488 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.179779 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.194076 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.209550 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:53Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.248366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:53 crc 
kubenswrapper[4713]: I0127 15:47:53.248402 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.248410 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.248428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.248438 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.351574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.351995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.352129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.352241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.352337 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.455117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.455168 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.455180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.455198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.455208 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.558483 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.558551 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.558564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.558583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.558596 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.661444 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.661505 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.661517 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.661537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.661551 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.764647 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.764709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.764720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.764740 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.764752 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.866722 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 23:49:16.205141327 +0000 UTC
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.867973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.868070 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.868085 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.868103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.868115 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.898609 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.898709 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:47:53 crc kubenswrapper[4713]: E0127 15:47:53.898812 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7"
Jan 27 15:47:53 crc kubenswrapper[4713]: E0127 15:47:53.898892 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.898992 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:47:53 crc kubenswrapper[4713]: E0127 15:47:53.899062 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.899111 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:47:53 crc kubenswrapper[4713]: E0127 15:47:53.899174 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.971679 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.971741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.971754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.971774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:53 crc kubenswrapper[4713]: I0127 15:47:53.971787 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:53Z","lastTransitionTime":"2026-01-27T15:47:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.074941 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.075001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.075010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.075026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.075065 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.092573 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:47:54 crc kubenswrapper[4713]: E0127 15:47:54.092743 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 27 15:47:54 crc kubenswrapper[4713]: E0127 15:47:54.092820 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:47:58.092796304 +0000 UTC m=+45.871006242 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.178062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.178115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.178123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.178142 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.178159 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.281121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.281170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.281180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.281198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.281212 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.384428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.384482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.384493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.384510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.384522 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.488504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.488587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.488603 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.488631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.488668 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.591837 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.591906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.591923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.591943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.591955 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.695149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.695194 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.695206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.695223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.695239 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.797617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.797685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.797702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.797732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.797751 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.868771 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 08:57:58.730399715 +0000 UTC
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.901264 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.901353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.901372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.901417 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:54 crc kubenswrapper[4713]: I0127 15:47:54.901432 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:54Z","lastTransitionTime":"2026-01-27T15:47:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.004708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.004772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.004789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.004817 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.004835 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.107922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.107984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.107999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.108023 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.108078 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.211136 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.211185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.211196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.211212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.211222 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.314101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.314186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.314200 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.314222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.314237 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.417382 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.417422 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.417431 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.417446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.417458 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.519960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.520011 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.520022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.520069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.520083 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.629120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.629187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.629335 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.629392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.629415 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.731936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.731984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.731995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.732014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.732025 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.834832 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.834919 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.834938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.834966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.834987 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.869365 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 23:01:13.313633356 +0000 UTC
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.898988 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.899165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.899465 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.899638 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:47:55 crc kubenswrapper[4713]: E0127 15:47:55.899649 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7"
Jan 27 15:47:55 crc kubenswrapper[4713]: E0127 15:47:55.899789 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:47:55 crc kubenswrapper[4713]: E0127 15:47:55.899830 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:47:55 crc kubenswrapper[4713]: E0127 15:47:55.899915 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.938115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.938168 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.938184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.938210 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:55 crc kubenswrapper[4713]: I0127 15:47:55.938228 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:55Z","lastTransitionTime":"2026-01-27T15:47:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.041207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.041263 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.041278 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.041300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.041316 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.143887 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.143930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.143939 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.143961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.143972 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.247084 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.247471 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.247572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.247673 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.247759 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.350148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.350476 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.350624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.350747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.350827 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.453491 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.453542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.453556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.453574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.453585 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.556663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.556718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.556727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.556748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.556760 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.658967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.659001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.659009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.659025 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.659056 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.762287 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.762602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.762730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.762851 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.763001 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.809826 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.811272 4713 scope.go:117] "RemoveContainer" containerID="1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373" Jan 27 15:47:56 crc kubenswrapper[4713]: E0127 15:47:56.811515 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.836979 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.851890 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.866369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.866415 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.866430 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.866450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.866462 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.867814 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.869855 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 11:07:07.870386614 +0000 UTC Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.880850 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.893722 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.908601 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.925016 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.941807 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.956451 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.969019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.969114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.969129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.969154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.969167 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:56Z","lastTransitionTime":"2026-01-27T15:47:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.973467 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e
4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:56 crc kubenswrapper[4713]: I0127 15:47:56.989586 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:56Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.006659 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:57Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.025749 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:57Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.045127 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed 
attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:57Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.059203 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:57Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.071756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.071801 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.071809 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.071828 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.071838 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.079580 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:57Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.095198 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:57Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.174755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.174805 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.174816 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 
15:47:57.174834 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.174847 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.277218 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.277268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.277280 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.277304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.277322 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.380571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.380615 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.380627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.380673 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.380697 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.482852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.482892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.482903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.482921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.482933 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.585772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.585856 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.585882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.585911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.585933 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.688920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.688966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.688977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.688995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.689007 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.792472 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.792508 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.792518 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.792533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.792542 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.870003 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 01:34:35.852454068 +0000 UTC Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.894927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.894971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.894980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.894998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.895011 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.899244 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.899234 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:57 crc kubenswrapper[4713]: E0127 15:47:57.899381 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.899239 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.899240 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:57 crc kubenswrapper[4713]: E0127 15:47:57.899474 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:57 crc kubenswrapper[4713]: E0127 15:47:57.899554 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:57 crc kubenswrapper[4713]: E0127 15:47:57.899606 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.998193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.998292 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.998341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.998366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:57 crc kubenswrapper[4713]: I0127 15:47:57.998378 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:57Z","lastTransitionTime":"2026-01-27T15:47:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.100748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.100804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.100820 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.100845 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.100861 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.141480 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.141704 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.141846 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:06.141815215 +0000 UTC m=+53.920025163 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.204111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.204177 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.204200 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.204225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.204240 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.306848 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.306906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.306927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.306954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.306968 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.410209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.410263 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.410278 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.410300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.410319 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.467708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.467761 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.467772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.467791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.467804 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.480637 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:58Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.485262 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.485308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.485325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.485348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.485364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.498790 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:58Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.502104 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.502130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.502139 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.502151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.502161 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.513537 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:58Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.516799 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.516844 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.516855 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.516876 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.516888 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.529576 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:58Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.533634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.533704 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.533719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.533738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.533750 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.548123 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:47:58Z is after 2025-08-24T17:21:41Z" Jan 27 15:47:58 crc kubenswrapper[4713]: E0127 15:47:58.548441 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.550490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.550552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.550566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.550588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.550603 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.653801 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.653889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.653903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.653923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.653940 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.757767 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.757831 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.757850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.757874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.757890 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.866376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.866427 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.866446 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.866482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.866502 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.870910 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 07:32:40.343948367 +0000 UTC Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.969873 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.970530 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.970837 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.971145 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:58 crc kubenswrapper[4713]: I0127 15:47:58.971567 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:58Z","lastTransitionTime":"2026-01-27T15:47:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.075558 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.075617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.075629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.075650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.075661 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.178379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.178421 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.178431 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.178451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.178461 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.281498 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.281548 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.281563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.281584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.281605 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.385236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.385276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.385285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.385300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.385311 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.488442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.488533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.488557 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.488590 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.488612 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.591178 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.591266 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.591279 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.591298 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.591309 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.694525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.694580 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.694592 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.694611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.694629 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.797760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.797820 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.797836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.797858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.797870 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.871561 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 14:03:55.959526148 +0000 UTC Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.899242 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.899300 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.899436 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:47:59 crc kubenswrapper[4713]: E0127 15:47:59.899584 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.899691 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:47:59 crc kubenswrapper[4713]: E0127 15:47:59.899781 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:47:59 crc kubenswrapper[4713]: E0127 15:47:59.899925 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:47:59 crc kubenswrapper[4713]: E0127 15:47:59.900083 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.900912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.900956 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.900974 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.900996 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:47:59 crc kubenswrapper[4713]: I0127 15:47:59.901011 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:47:59Z","lastTransitionTime":"2026-01-27T15:47:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.004523 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.004580 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.004594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.004615 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.004668 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.108342 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.108393 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.108404 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.108425 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.108438 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.212063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.212149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.212166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.212188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.212203 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.315779 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.315835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.315846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.315868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.315883 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.418774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.418857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.418875 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.418912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.418930 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.521881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.521938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.521947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.521966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.521978 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.625515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.625571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.625587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.625611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.625627 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.728276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.728324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.728334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.728349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.728359 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.830529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.830584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.830609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.830632 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.830650 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.872664 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 15:12:26.458582877 +0000 UTC Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.933778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.933838 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.933852 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.933873 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:00 crc kubenswrapper[4713]: I0127 15:48:00.933885 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:00Z","lastTransitionTime":"2026-01-27T15:48:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.036653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.036716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.036730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.036753 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.036767 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.139800 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.139869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.139881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.139901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.139912 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.242901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.242944 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.242955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.242973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.242984 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.346204 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.346247 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.346260 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.346277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.346288 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.449424 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.449483 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.449494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.449510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.449521 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.552310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.552356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.552365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.552387 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.552397 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.655153 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.655209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.655225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.655249 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.655265 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.759451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.759515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.759538 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.759569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.759591 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.862330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.862377 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.862392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.862416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.862433 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.873083 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 13:53:11.118563331 +0000 UTC
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.898822 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.898853 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.898918 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:48:01 crc kubenswrapper[4713]: E0127 15:48:01.898971 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.899180 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:48:01 crc kubenswrapper[4713]: E0127 15:48:01.899239 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:48:01 crc kubenswrapper[4713]: E0127 15:48:01.899724 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7"
Jan 27 15:48:01 crc kubenswrapper[4713]: E0127 15:48:01.899873 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.965100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.965145 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.965156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.965174 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:01 crc kubenswrapper[4713]: I0127 15:48:01.965186 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:01Z","lastTransitionTime":"2026-01-27T15:48:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.068129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.068161 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.068170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.068186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.068196 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.171061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.171125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.171141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.171162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.171174 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.274874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.274945 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.274967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.275001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.275025 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.378099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.378144 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.378154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.378170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.378180 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.480897 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.481406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.481545 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.481659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.481744 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.585655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.586138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.586269 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.586383 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.586489 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.690211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.690847 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.690964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.691098 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.691231 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.793829 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.793884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.793899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.793923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.793939 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.873862 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 19:34:31.282090085 +0000 UTC
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.897397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.897632 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.897645 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.897663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.897678 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:02Z","lastTransitionTime":"2026-01-27T15:48:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.914143 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.928080 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.941857 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.956269 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.967784 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:02 crc kubenswrapper[4713]: I0127 15:48:02.981489 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:02Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.001372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.001727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.001850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.001999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.002162 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.011794 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.026486 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.040717 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.052602 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.067276 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.081828 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.095558 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.105122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.105162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.105188 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.105207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.105218 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.115627 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed 
attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.127991 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.139772 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.153697 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:03Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.207365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.207659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.207792 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc 
kubenswrapper[4713]: I0127 15:48:03.207886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.207970 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.310643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.310671 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.310679 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.310693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.310703 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.414255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.414304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.414315 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.414333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.414347 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.517157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.517198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.517207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.517221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.517251 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.619404 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.619469 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.619479 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.619512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.619524 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.706506 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.706894 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:48:35.706841184 +0000 UTC m=+83.485051152 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.723489 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.723579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.723610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.723642 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.723665 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.807924 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.807981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.808011 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.808063 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808154 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808227 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:35.808204387 +0000 UTC m=+83.586414325 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808227 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808279 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808321 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808348 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808380 4713 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808323 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:35.80829604 +0000 UTC m=+83.586506008 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808424 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808447 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808448 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:35.808423904 +0000 UTC m=+83.586633872 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.808517 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:35.808494676 +0000 UTC m=+83.586704654 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.827075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.827164 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.827196 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.827225 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.827248 4713 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.874666 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 10:50:40.407103544 +0000 UTC Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.899282 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.899331 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.899575 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.899625 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.899717 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.899793 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.899822 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:03 crc kubenswrapper[4713]: E0127 15:48:03.899963 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.930388 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.930433 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.930442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.930457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:03 crc kubenswrapper[4713]: I0127 15:48:03.930467 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:03Z","lastTransitionTime":"2026-01-27T15:48:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.033674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.033738 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.033749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.033770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.033781 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.140325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.140481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.141129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.141154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.141165 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.242931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.242991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.243003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.243023 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.243059 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.345723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.345776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.345786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.345803 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.345816 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.448837 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.448912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.448922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.448941 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.448953 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.551244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.551304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.551316 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.551333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.551343 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.654849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.654890 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.654901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.654921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.654932 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.757296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.757348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.757365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.757386 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.757401 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.860857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.860904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.860915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.860930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.860942 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.875428 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 13:40:01.38978991 +0000 UTC Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.963916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.963979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.963991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.964007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:04 crc kubenswrapper[4713]: I0127 15:48:04.964019 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:04Z","lastTransitionTime":"2026-01-27T15:48:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.066814 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.066854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.066866 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.066883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.066898 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.169610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.169644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.169653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.169667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.169678 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.273470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.273540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.273562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.273584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.273599 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.377067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.377119 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.377129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.377148 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.377161 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.479357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.479426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.479438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.479477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.479493 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.581884 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.581960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.581973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.581994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.582009 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.684869 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.684932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.684943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.684964 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.684976 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.787876 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.787925 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.787935 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.787953 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.787962 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.876326 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:51:10.911474607 +0000 UTC Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.890765 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.890806 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.890821 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.890843 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.890861 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.899116 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.899139 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.899201 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.899243 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:05 crc kubenswrapper[4713]: E0127 15:48:05.899354 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:05 crc kubenswrapper[4713]: E0127 15:48:05.899461 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:05 crc kubenswrapper[4713]: E0127 15:48:05.899521 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:05 crc kubenswrapper[4713]: E0127 15:48:05.899601 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.994548 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.994622 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.994637 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.994677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:05 crc kubenswrapper[4713]: I0127 15:48:05.994690 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:05Z","lastTransitionTime":"2026-01-27T15:48:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.099000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.099067 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.099077 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.099096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.099107 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.201935 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.201977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.201991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.202012 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.202027 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.229709 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:06 crc kubenswrapper[4713]: E0127 15:48:06.230058 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:48:06 crc kubenswrapper[4713]: E0127 15:48:06.230395 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:22.230301994 +0000 UTC m=+70.008511952 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.304863 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.304900 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.304925 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.304942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.304952 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.408004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.408096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.408110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.408128 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.408140 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.510836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.510874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.510887 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.510907 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.510919 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.613523 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.613899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.613978 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.614080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.614177 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.717643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.717691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.717702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.717723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.717736 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.820475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.820545 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.820561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.820589 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.820607 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.877181 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-12 11:56:02.723386028 +0000 UTC Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.923700 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.924147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.924354 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.924543 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:06 crc kubenswrapper[4713]: I0127 15:48:06.924788 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:06Z","lastTransitionTime":"2026-01-27T15:48:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.027501 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.027771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.027879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.027947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.028013 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.130730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.130778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.130789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.130808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.130820 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.233937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.233989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.234000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.234018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.234029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.336211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.336266 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.336276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.336294 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.336306 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.439187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.439238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.439252 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.439270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.439283 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.542431 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.542480 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.542493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.542514 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.542527 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.644661 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.644725 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.644734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.644748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.644757 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.696597 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.708910 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.714572 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\"
:\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.756878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.757222 4713 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.757716 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.757822 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.757909 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.759937 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.775634 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.789587 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.809402 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacco
unt\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState
\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.833348 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed 
attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.850165 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.860409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.860456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.860465 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.860483 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.860495 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.864933 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.877323 4713 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.878598 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 12:54:15.076000678 +0000 UTC Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.891413 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.898996 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.899010 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.899113 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.899156 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:07 crc kubenswrapper[4713]: E0127 15:48:07.899140 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:07 crc kubenswrapper[4713]: E0127 15:48:07.899244 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:07 crc kubenswrapper[4713]: E0127 15:48:07.899398 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:07 crc kubenswrapper[4713]: E0127 15:48:07.899491 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.905143 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.920277 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.932255 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.942854 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.953480 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.962454 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.962536 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.962576 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.962598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.962609 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:07Z","lastTransitionTime":"2026-01-27T15:48:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.974098 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:07 crc kubenswrapper[4713]: I0127 15:48:07.986990 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:07Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.065301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.065356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.065372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.065397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.065415 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.168369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.168419 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.168433 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.168453 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.168466 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.270514 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.270572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.270590 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.270616 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.270630 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.373158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.373237 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.373257 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.373285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.373304 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.476410 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.476475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.476488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.476510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.476525 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.579108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.579162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.579175 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.579193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.579205 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.681481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.681537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.681545 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.681560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.681571 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.784186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.784227 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.784238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.784256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.784268 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.834063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.834139 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.834165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.834201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.834241 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: E0127 15:48:08.850395 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:08Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.855612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.855733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.855751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.855778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.855823 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: E0127 15:48:08.870945 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:08Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.874839 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.874873 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.874882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.874899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.874909 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.878685 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 04:49:55.649982647 +0000 UTC Jan 27 15:48:08 crc kubenswrapper[4713]: E0127 15:48:08.888993 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",
\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:08Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.893887 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.893921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.893932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.893952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.893966 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: E0127 15:48:08.908004 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:08Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.911846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.911898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.911908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.911923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.911934 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:08 crc kubenswrapper[4713]: E0127 15:48:08.924522 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:08Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:08 crc kubenswrapper[4713]: E0127 15:48:08.924667 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.926541 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.926570 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.926594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.926610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:08 crc kubenswrapper[4713]: I0127 15:48:08.926622 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:08Z","lastTransitionTime":"2026-01-27T15:48:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.029334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.029467 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.029480 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.029497 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.029507 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.132155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.132192 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.132200 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.132218 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.132232 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.236013 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.236102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.236113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.236132 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.236141 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.339131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.339199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.339214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.339234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.339248 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.441915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.442006 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.442029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.442095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.442120 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.544789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.544863 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.544882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.544912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.544931 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.647833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.647905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.647922 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.647943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.647957 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.750894 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.750959 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.750970 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.750995 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.751009 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.853993 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.854064 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.854073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.854091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.854101 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.879610 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 23:42:58.231986861 +0000 UTC Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.899112 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.899162 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.899162 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.899112 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:09 crc kubenswrapper[4713]: E0127 15:48:09.899335 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:09 crc kubenswrapper[4713]: E0127 15:48:09.899457 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:09 crc kubenswrapper[4713]: E0127 15:48:09.899558 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:09 crc kubenswrapper[4713]: E0127 15:48:09.899647 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.956866 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.956912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.956926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.956945 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:09 crc kubenswrapper[4713]: I0127 15:48:09.956958 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:09Z","lastTransitionTime":"2026-01-27T15:48:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.060443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.060501 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.060513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.060532 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.060546 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.164329 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.164388 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.164398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.164416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.164425 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.267407 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.267449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.267459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.267476 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.267487 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.371352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.371429 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.371452 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.371484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.371507 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.474471 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.474513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.474525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.474542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.474555 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.577850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.577899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.577909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.577928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.577940 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.680557 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.680931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.681015 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.681133 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.681201 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.784507 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.784566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.784579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.784601 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.784613 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.880772 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 05:05:11.959707991 +0000 UTC Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.887546 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.887760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.887867 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.887937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.888028 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.991157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.991211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.991223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.991241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:10 crc kubenswrapper[4713]: I0127 15:48:10.991254 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:10Z","lastTransitionTime":"2026-01-27T15:48:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.093650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.093696 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.093705 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.093725 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.093736 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.197485 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.197539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.197555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.197582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.197603 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.300682 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.300774 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.300798 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.300827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.300849 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.403845 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.403903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.403915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.403936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.403949 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.507393 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.507429 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.507437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.507452 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.507472 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.610762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.610883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.610895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.610916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.610929 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.714467 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.714520 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.714529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.714551 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.714563 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.818100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.818212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.818245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.818280 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.818304 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.881224 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 08:38:33.853476974 +0000 UTC Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.898758 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.898796 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.898781 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.898758 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:11 crc kubenswrapper[4713]: E0127 15:48:11.898921 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:11 crc kubenswrapper[4713]: E0127 15:48:11.899223 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:11 crc kubenswrapper[4713]: E0127 15:48:11.899525 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:11 crc kubenswrapper[4713]: E0127 15:48:11.899739 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.899773 4713 scope.go:117] "RemoveContainer" containerID="1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.921068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.921108 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.921120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.921138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:11 crc kubenswrapper[4713]: I0127 15:48:11.921152 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:11Z","lastTransitionTime":"2026-01-27T15:48:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.030091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.030493 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.030610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.030641 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.030659 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.134206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.134263 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.134273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.134291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.134301 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.237393 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.237428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.237437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.237455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.237467 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.274450 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/1.log" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.277552 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.278457 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.302635 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.324090 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.340969 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.341029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.341058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.341079 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.341094 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.364121 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.380389 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.411027 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad2
9536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.425524 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.443625 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.443763 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.444100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.444114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.444133 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.444147 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.459939 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.473107 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.488839 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.503766 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af
0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\
":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.521004 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4
d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.542116 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed 
attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: 
de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.547482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.547535 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.547547 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.547569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.547581 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.555521 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.568381 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.582595 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.597397 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.609859 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.649558 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.649868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.650002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 
15:48:12.650132 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.650247 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.753024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.753094 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.753106 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.753123 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.753132 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.856467 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.856512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.856522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.856541 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.856553 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.882307 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 10:38:44.977662261 +0000 UTC Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.912887 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"q
uay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.924136 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.934952 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.947705 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.959309 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.959379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.959403 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.959434 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.959457 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:12Z","lastTransitionTime":"2026-01-27T15:48:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.967081 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed 
attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: 
de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.979632 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:12 crc kubenswrapper[4713]: I0127 15:48:12.995620 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:12Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.013371 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.033814 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.046220 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.057566 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.062911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.062974 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.062993 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.063021 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.063069 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.071154 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.079672 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.090685 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc 
kubenswrapper[4713]: I0127 15:48:13.103120 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.115359 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.127352 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.137707 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.164854 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.164892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.164900 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.164916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.164927 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.268781 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.269256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.269399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.269584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.269738 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.284885 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/2.log" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.285930 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/1.log" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.290627 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8" exitCode=1 Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.290696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.290761 4713 scope.go:117] "RemoveContainer" containerID="1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.292259 4713 scope.go:117] "RemoveContainer" containerID="bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8" Jan 27 15:48:13 crc kubenswrapper[4713]: E0127 15:48:13.292677 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.312529 4713 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc 
kubenswrapper[4713]: I0127 15:48:13.346673 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.367316 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.373202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.373279 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.373304 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.373335 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.373358 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.381858 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.399348 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.415871 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.432487 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.447469 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.458668 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.471736 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.476526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.476835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.476908 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.476975 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.477116 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.484076 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.497995 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{
\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.511868 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.524255 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.545329 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.567615 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1eee480170cb9942c12f026b2f7007bb65f4cd719c8bb874ae919ede0bab6373\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"message\\\":\\\"d/etcd-crc\\\\nI0127 15:47:48.246451 6164 obj_retry.go:365] Adding new object: *v1.Pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246457 6164 ovn.go:134] Ensuring zone local for Pod openshift-etcd/etcd-crc in node crc\\\\nI0127 15:47:48.246461 6164 obj_retry.go:386] Retry successful for *v1.Pod openshift-etcd/etcd-crc after 0 failed 
attempt(s)\\\\nI0127 15:47:48.246465 6164 default_network_controller.go:776] Recording success event on pod openshift-etcd/etcd-crc\\\\nI0127 15:47:48.246324 6164 services_controller.go:434] Service openshift-image-registry/image-registry retrieved from lister for network=default: \\\\u0026Service{ObjectMeta:{image-registry openshift-image-registry 8a06dbaa-f3c9-4dca-b7f2-c0a78edd88d0 19654 0 2025-02-24 06:08:59 +0000 UTC \\\\u003cnil\\\\u003e \\\\u003cnil\\\\u003e map[docker-registry:default] map[imageregistry.operator.openshift.io/checksum:sha256:1c19715a76014ae1d56140d6390a08f14f453c1a59dc36c15718f40c638ef63d service.alpha.openshift.io/serving-cert-secret-name:image-registry-tls service.alpha.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168 service.beta.openshift.io/serving-cert-signed-by:openshift-service-serving-signer@1740288168] [] [] []},Spec:ServiceSpec{Ports:[]ServicePort{ServicePort{Name:5000-tcp,Protocol:TCP,Port:5000,TargetPort:{0 5000 },NodePort:0,AppProtocol:nil,},},Selector:map[string]string{docker-registry: de\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k 
openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\
\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceacc
ount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.579438 4713 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.579623 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.579779 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.579936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.580029 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.588222 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.601693 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:13Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.683521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.683629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc 
kubenswrapper[4713]: I0127 15:48:13.683655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.683687 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.683710 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.786849 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.786926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.786934 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.786958 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.786981 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.882988 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 12:12:57.169032947 +0000 UTC Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.890435 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.890475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.890490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.890512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.890528 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.898911 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.898931 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.898963 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:13 crc kubenswrapper[4713]: E0127 15:48:13.899088 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.899135 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:13 crc kubenswrapper[4713]: E0127 15:48:13.899334 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:13 crc kubenswrapper[4713]: E0127 15:48:13.899328 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:13 crc kubenswrapper[4713]: E0127 15:48:13.899444 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.993520 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.993572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.993620 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.993640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:13 crc kubenswrapper[4713]: I0127 15:48:13.993655 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:13Z","lastTransitionTime":"2026-01-27T15:48:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.096883 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.096937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.096949 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.096967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.096980 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.200669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.200756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.200772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.200794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.200817 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.297529 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/2.log" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.302978 4713 scope.go:117] "RemoveContainer" containerID="bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8" Jan 27 15:48:14 crc kubenswrapper[4713]: E0127 15:48:14.303345 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.305095 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.305147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.305166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.305192 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.305213 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.321756 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/v
ar/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-ope
rator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.344647 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.371170 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.387619 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.404020 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.408183 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.408223 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.408236 4713 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.408255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.408300 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.419334 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.431136 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.443655 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc 
kubenswrapper[4713]: I0127 15:48:14.456482 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.471488 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.482298 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.496400 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.510741 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.511495 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.511544 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.511555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.511575 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 
15:48:14.511587 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.523640 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.536136 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.552382 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcd
fa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:
47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.571952 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.583720 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:14Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.614073 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.614119 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.614131 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.614147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.614161 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.716769 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.716809 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.716819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.716838 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.716847 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.819755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.819801 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.819813 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.819832 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.819846 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.883915 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 05:48:43.016230575 +0000 UTC Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.922151 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.922203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.922215 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.922235 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:14 crc kubenswrapper[4713]: I0127 15:48:14.922250 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:14Z","lastTransitionTime":"2026-01-27T15:48:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.025051 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.025103 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.025115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.025133 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.025145 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.128184 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.128237 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.128253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.128275 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.128292 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.230885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.230939 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.230948 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.230973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.230992 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.333212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.333244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.333254 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.333268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.333277 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.435892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.435935 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.435947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.435963 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.435973 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.538879 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.538926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.538938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.538957 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.538970 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.641739 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.641794 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.641806 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.641825 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.641838 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.745345 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.745399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.745411 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.745430 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.745443 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.848265 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.848306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.848316 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.848333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.848344 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.885901 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 21:36:26.464022162 +0000 UTC
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.899471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.899537 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.899471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:48:15 crc kubenswrapper[4713]: E0127 15:48:15.899650 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.899711 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:48:15 crc kubenswrapper[4713]: E0127 15:48:15.899891 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:48:15 crc kubenswrapper[4713]: E0127 15:48:15.899997 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:48:15 crc kubenswrapper[4713]: E0127 15:48:15.900082 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.950826 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.950880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.950888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.950904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:15 crc kubenswrapper[4713]: I0127 15:48:15.950914 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:15Z","lastTransitionTime":"2026-01-27T15:48:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.053390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.053438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.053452 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.053471 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.053485 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.155956 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.155998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.156009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.156032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.156071 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.259260 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.259376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.259407 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.259435 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.259455 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.362627 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.362676 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.362691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.362712 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.362727 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.466122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.466162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.466174 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.466197 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.466210 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.568382 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.568429 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.568440 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.568459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.568472 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.671180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.671228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.671248 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.671262 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.671275 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.774399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.774447 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.774458 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.774476 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.774488 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.877479 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.877533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.877545 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.877566 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.877579 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.886660 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 11:24:44.705838185 +0000 UTC
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.980138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.980186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.980198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.980216 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:16 crc kubenswrapper[4713]: I0127 15:48:16.980229 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:16Z","lastTransitionTime":"2026-01-27T15:48:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.083398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.083461 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.083474 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.083496 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.083511 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.186833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.186938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.186952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.186970 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.186980 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.289614 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.289670 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.289688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.289711 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.289725 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.392494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.392538 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.392546 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.392564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.392573 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.495341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.495374 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.495382 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.495399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.495410 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.598219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.598271 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.598281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.598301 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.598315 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.700162 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.700193 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.700202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.700219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.700229 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.803705 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.803756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.803765 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.803784 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.803796 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.887076 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 06:29:17.161787361 +0000 UTC
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.899445 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.899523 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:48:17 crc kubenswrapper[4713]: E0127 15:48:17.899607 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.899663 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:48:17 crc kubenswrapper[4713]: E0127 15:48:17.899706 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.899553 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:48:17 crc kubenswrapper[4713]: E0127 15:48:17.899846 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7"
Jan 27 15:48:17 crc kubenswrapper[4713]: E0127 15:48:17.899894 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.906672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.906786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.906906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.907008 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:17 crc kubenswrapper[4713]: I0127 15:48:17.907136 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:17Z","lastTransitionTime":"2026-01-27T15:48:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.009975 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.010328 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.011070 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.011125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.011141 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.114065 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.114119 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.114128 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.114147 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.114156 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.221371 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.221464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.221477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.221504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.221523 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.323746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.323796 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.323807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.323827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.323841 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.426754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.426811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.426827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.426850 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.426867 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.528874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.528937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.528947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.528961 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.528971 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.631270 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.631310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.631318 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.631333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.631344 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.734930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.734977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.734987 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.735002 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.735014 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.837916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.837971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.837984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.838005 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.838017 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.887933 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 06:54:53.984049718 +0000 UTC Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.941216 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.941276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.941290 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.941312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:18 crc kubenswrapper[4713]: I0127 15:48:18.941328 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:18Z","lastTransitionTime":"2026-01-27T15:48:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.044170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.044219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.044232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.044253 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.044268 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.129245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.129283 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.129293 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.129308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.129318 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.142583 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.146300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.146410 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.146432 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.146448 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.146459 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.160628 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.164486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.164547 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.164560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.164583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.164596 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.176821 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.180342 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.180370 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.180381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.180397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.180407 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.191385 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.194967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.194992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.195003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.195022 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.195053 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.207724 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:19Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.208015 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.209634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.209744 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.209835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.209929 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.210022 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.312017 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.312069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.312081 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.312098 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.312109 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.415164 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.415214 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.415234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.415256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.415269 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.517819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.517868 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.517880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.517899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.517914 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.620746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.620786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.620797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.620813 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.620825 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.723822 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.723874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.723885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.723905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.723919 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.826565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.826611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.826624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.826644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.826658 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.888672 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 12:44:30.726933696 +0000 UTC Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.899337 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.899404 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.899469 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.899514 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.899425 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.899626 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.899811 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:19 crc kubenswrapper[4713]: E0127 15:48:19.899906 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.929843 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.929882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.929893 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.929912 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:19 crc kubenswrapper[4713]: I0127 15:48:19.929926 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:19Z","lastTransitionTime":"2026-01-27T15:48:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.031959 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.032001 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.032009 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.032024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.032059 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.134679 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.134718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.134727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.134743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.134754 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.237902 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.237954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.237966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.237985 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.237998 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.340485 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.340522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.340537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.340561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.340574 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.443287 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.443333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.443341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.443357 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.443366 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.545846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.545895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.545906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.545926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.545939 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.648592 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.648642 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.648651 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.648667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.648677 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.751453 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.751504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.751514 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.751531 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.751545 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.854075 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.854470 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.854938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.854968 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.854990 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.888822 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:59:05.909855349 +0000 UTC Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.958027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.958086 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.958100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.958116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:20 crc kubenswrapper[4713]: I0127 15:48:20.958128 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:20Z","lastTransitionTime":"2026-01-27T15:48:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.061218 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.061296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.061308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.061325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.061337 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.163818 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.163915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.163927 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.163947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.163978 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.267164 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.267204 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.267217 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.267236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.267251 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.370451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.370758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.370890 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.371000 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.371186 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.474299 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.474338 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.474347 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.474361 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.474371 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.577066 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.577416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.577513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.577594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.577670 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.680583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.680631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.680639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.680658 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.680668 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.783557 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.783612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.783624 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.783643 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.783655 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.887019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.887100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.887112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.887131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.887142 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.889235 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 15:40:05.685374926 +0000 UTC Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.898516 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.898563 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.898588 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.898626 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:21 crc kubenswrapper[4713]: E0127 15:48:21.899295 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:21 crc kubenswrapper[4713]: E0127 15:48:21.899430 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:21 crc kubenswrapper[4713]: E0127 15:48:21.899527 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:21 crc kubenswrapper[4713]: E0127 15:48:21.899672 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.991222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.991273 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.991282 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.991303 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:21 crc kubenswrapper[4713]: I0127 15:48:21.991314 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:21Z","lastTransitionTime":"2026-01-27T15:48:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.093629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.093678 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.093689 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.093710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.093723 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.196772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.196828 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.196837 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.196858 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.196872 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.286861 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:22 crc kubenswrapper[4713]: E0127 15:48:22.287110 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:48:22 crc kubenswrapper[4713]: E0127 15:48:22.287237 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:48:54.287208013 +0000 UTC m=+102.065418011 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.299529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.299582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.299591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.299610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.299621 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.401972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.402030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.402062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.402082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.402097 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.505092 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.505464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.505549 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.505642 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.505721 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.608663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.608747 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.608756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.608773 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.608784 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.711786 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.711853 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.711866 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.711886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.711896 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.815322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.815390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.815400 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.815418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.815433 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.891163 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 07:52:13.054047915 +0000 UTC Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.913121 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01
-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.918353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.918382 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.918392 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.918411 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.918423 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:22Z","lastTransitionTime":"2026-01-27T15:48:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.927733 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.947408 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.961280 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.975534 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:22 crc kubenswrapper[4713]: I0127 15:48:22.987505 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.001779 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:22Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.018412 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.020761 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.021210 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.021226 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 
15:48:23.021245 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.021257 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.032687 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.047497 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.062204 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.075158 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.097750 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad2
9536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.112068 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.124426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.124459 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.124471 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.124490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.124500 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.124792 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.133749 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.143195 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.152682 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:23Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.226441 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.226472 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.226480 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.226496 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.226505 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.328916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.328967 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.328978 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.328996 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.329062 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.431647 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.432007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.432124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.432220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.432306 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.535668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.536028 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.536211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.536295 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.536359 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.639226 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.639306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.639324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.639346 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.639360 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.742384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.742418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.742428 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.742443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.742455 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.845555 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.845628 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.845650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.845680 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.845701 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.892072 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 23:24:08.109908377 +0000 UTC Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.898632 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.898662 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.898968 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:23 crc kubenswrapper[4713]: E0127 15:48:23.899208 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.899238 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:23 crc kubenswrapper[4713]: E0127 15:48:23.899266 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:23 crc kubenswrapper[4713]: E0127 15:48:23.899352 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:23 crc kubenswrapper[4713]: E0127 15:48:23.899441 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.948525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.948573 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.948584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.948603 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:23 crc kubenswrapper[4713]: I0127 15:48:23.948615 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:23Z","lastTransitionTime":"2026-01-27T15:48:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.052116 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.052178 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.052195 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.052220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.052240 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.155827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.155901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.155924 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.155980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.156016 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.259389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.259438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.259453 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.259476 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.259492 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.336410 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/0.log" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.336462 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf07c585-f90e-4416-a66c-d41547008320" containerID="d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e" exitCode=1 Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.336494 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerDied","Data":"d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.336876 4713 scope.go:117] "RemoveContainer" containerID="d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.349168 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.361943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.361986 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.361997 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.362016 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.362030 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.371929 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.385844 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.400620 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.415651 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.429267 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.443338 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.455155 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.465356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.465395 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.465404 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.465420 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.465431 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.468216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.480726 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.491107 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.507892 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.522939 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.537478 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:24Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.550985 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4
d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.567617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.567945 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.568052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.568144 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.568220 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.569209 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.583776 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.596561 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:24Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.671998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.672062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.672081 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 
15:48:24.672099 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.672111 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.774998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.775063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.775077 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.775101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.775116 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.878726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.878781 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.878797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.878823 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.878847 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.893143 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 17:19:51.337999546 +0000 UTC Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.982331 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.982366 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.982374 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.982389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:24 crc kubenswrapper[4713]: I0127 15:48:24.982398 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:24Z","lastTransitionTime":"2026-01-27T15:48:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.085405 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.085464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.085477 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.085499 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.085510 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.187977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.188061 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.188082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.188102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.188116 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.291669 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.291727 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.291741 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.291759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.291773 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.342498 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/0.log" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.342562 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerStarted","Data":"30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.366611 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\
",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.382505 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4
d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.394772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.394814 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.394827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.394848 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.394863 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.404869 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.421816 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.435819 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.450781 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.462464 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.477774 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.492726 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.497829 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.498003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.498313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.498492 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.498609 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.509446 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.519305 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4
.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.528096 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc 
kubenswrapper[4713]: I0127 15:48:25.545870 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.559828 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.575285 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.585887 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.598422 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.601131 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.601168 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.601183 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.601204 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.601218 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.613011 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94
f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:25Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.703955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.704006 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.704018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.704052 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.704064 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.806789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.806848 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.806861 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.806882 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.806895 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.893264 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 15:50:15.616539731 +0000 UTC Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.898675 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.898737 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.898803 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:25 crc kubenswrapper[4713]: E0127 15:48:25.898951 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.899064 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:25 crc kubenswrapper[4713]: E0127 15:48:25.899141 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:25 crc kubenswrapper[4713]: E0127 15:48:25.899261 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:25 crc kubenswrapper[4713]: E0127 15:48:25.899320 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.910209 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.910414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.910501 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.910630 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:25 crc kubenswrapper[4713]: I0127 15:48:25.910718 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:25Z","lastTransitionTime":"2026-01-27T15:48:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.013213 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.013268 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.013281 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.013300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.013313 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.116930 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.116972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.116982 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.116998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.117011 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.220853 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.221252 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.221495 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.221737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.221924 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.325030 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.325109 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.325121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.325141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.325155 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.427581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.427631 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.427640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.427660 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.427669 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.531475 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.531535 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.531548 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.531567 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.531582 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.634312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.634371 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.634384 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.634406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.634421 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.737500 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.737560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.737570 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.737591 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.737604 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.840352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.840735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.840825 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.840920 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.840994 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.904442 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 03:28:13.713987201 +0000 UTC Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.944365 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.944708 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.944821 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.944904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:26 crc kubenswrapper[4713]: I0127 15:48:26.944986 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:26Z","lastTransitionTime":"2026-01-27T15:48:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.048190 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.048256 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.048264 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.048282 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.048293 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.151220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.151290 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.151339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.151362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.151379 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.254655 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.255113 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.255241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.255337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.255604 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.358444 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.358484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.358496 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.358512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.358523 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.462283 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.462323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.462332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.462349 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.462361 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.565322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.565368 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.565379 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.565396 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.565408 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.668682 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.668726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.668737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.668755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.668768 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.772114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.772159 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.772167 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.772182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.772192 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.874889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.874943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.874958 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.874981 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.874998 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.898568 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.898596 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.898647 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.898683 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 27 15:48:27 crc kubenswrapper[4713]: E0127 15:48:27.899349 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 27 15:48:27 crc kubenswrapper[4713]: E0127 15:48:27.899510 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7"
Jan 27 15:48:27 crc kubenswrapper[4713]: E0127 15:48:27.899620 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 27 15:48:27 crc kubenswrapper[4713]: E0127 15:48:27.899859 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.904895 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 10:27:15.982544423 +0000 UTC
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.978525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.978587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.978597 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.978616 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:27 crc kubenswrapper[4713]: I0127 15:48:27.978631 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:27Z","lastTransitionTime":"2026-01-27T15:48:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.082495 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.082558 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.082568 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.082586 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.082961 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.186579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.186641 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.186666 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.186695 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.186714 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.289895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.289929 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.289940 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.289959 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.289971 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.393857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.393907 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.393917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.393936 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.393947 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.497334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.497385 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.497394 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.497415 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.497425 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.600983 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.601070 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.601088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.601115 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.601151 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.704486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.704541 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.704557 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.704582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.704603 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.807578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.807616 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.807625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.807645 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.807657 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.900069 4713 scope.go:117] "RemoveContainer" containerID="bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8"
Jan 27 15:48:28 crc kubenswrapper[4713]: E0127 15:48:28.900251 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.905423 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 22:59:18.689075899 +0000 UTC
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.910029 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.910155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.910180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.910210 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:28 crc kubenswrapper[4713]: I0127 15:48:28.910233 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:28Z","lastTransitionTime":"2026-01-27T15:48:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.014456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.015011 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.015373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.015641 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.015784 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.119068 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.119133 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.119156 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.119187 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.119211 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.222785 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.222839 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.222856 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.222880 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.222902 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.325677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.325733 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.325752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.325776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.325793 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.429419 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.429496 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.429535 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.429571 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.429596 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.490996 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.491093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.491106 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.491126 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.491138 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.509212 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.513160 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.513219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.513228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.513246 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.513260 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.529021 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.533638 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.533707 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.533719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.533749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.533762 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.556996 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.561890 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.561955 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.561966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.561988 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.562001 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.580448 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.585305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.585348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.585358 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.585374 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.585385 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.607728 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:29Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.607920 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.609663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.609692 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.609701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.609717 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.609728 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.713064 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.713110 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.713124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.713146 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.713162 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.816701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.816746 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.816755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.816772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.816784 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.898822 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.898981 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.898823 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.898846 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.899163 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.898822 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.899206 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:29 crc kubenswrapper[4713]: E0127 15:48:29.899281 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.905888 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-15 14:14:03.082848875 +0000 UTC Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.918412 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.918447 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.918458 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.918474 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:29 crc kubenswrapper[4713]: I0127 15:48:29.918484 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:29Z","lastTransitionTime":"2026-01-27T15:48:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.021325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.021373 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.021389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.021408 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.021420 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.124277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.124320 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.124333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.124352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.124365 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.226762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.226845 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.226856 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.226876 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.226888 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.330016 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.330074 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.330082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.330097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.330107 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.432906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.433102 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.433120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.433141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.433154 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.535895 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.536023 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.536084 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.536120 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.536160 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.639866 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.640018 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.640087 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.640117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.640138 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.742556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.742595 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.742604 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.742620 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.742631 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.845974 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.846026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.846063 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.846089 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.846103 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.906411 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 08:02:09.17520836 +0000 UTC Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.948651 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.948742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.948752 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.948772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:30 crc kubenswrapper[4713]: I0127 15:48:30.948803 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:30Z","lastTransitionTime":"2026-01-27T15:48:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.051827 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.051886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.051904 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.051926 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.051939 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.155064 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.155114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.155127 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.155144 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.155155 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.258360 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.258416 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.258430 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.258454 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.258468 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.361199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.361327 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.361343 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.361362 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.361373 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.464360 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.464406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.464422 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.464443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.464457 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.567124 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.567167 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.567186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.567212 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.567224 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.670356 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.670736 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.670829 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.670928 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.671061 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.773575 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.773914 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.774013 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.774129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.774212 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.879787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.879983 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.879999 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.880058 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.880076 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.898627 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.898698 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:31 crc kubenswrapper[4713]: E0127 15:48:31.898822 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.898839 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:31 crc kubenswrapper[4713]: E0127 15:48:31.898961 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:31 crc kubenswrapper[4713]: E0127 15:48:31.899079 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.899379 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:31 crc kubenswrapper[4713]: E0127 15:48:31.899632 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.907544 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 15:50:46.306601123 +0000 UTC Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.983179 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.983220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.983232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.983249 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:31 crc kubenswrapper[4713]: I0127 15:48:31.983264 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:31Z","lastTransitionTime":"2026-01-27T15:48:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.086449 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.086489 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.086500 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.086515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.086524 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.189457 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.189506 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.189520 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.189539 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.189551 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.291744 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.291812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.291830 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.291860 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.291881 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.394817 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.394877 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.394888 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.394909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.394924 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.498772 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.498857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.498878 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.498905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.498925 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.601636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.601682 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.601693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.601714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.601725 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.705062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.705142 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.705152 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.705169 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.705179 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.808647 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.808683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.808693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.808710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.808722 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.908303 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 23:41:25.935744595 +0000 UTC Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.911296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.911333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.911344 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.911363 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.911377 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:32Z","lastTransitionTime":"2026-01-27T15:48:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.913905 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94
f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.927106 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.937550 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.951206 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.962840 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.974659 4713 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha25
6:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:32 crc kubenswrapper[4713]: I0127 15:48:32.991912 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:32Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.010022 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.013605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.013756 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.013816 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.013902 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.013970 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.025785 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.044529 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.060445 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\"
,\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use 
of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c
450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.076237 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.088559 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc 
kubenswrapper[4713]: I0127 15:48:33.108320 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.122276 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.122323 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.122337 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.122368 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.122384 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.125596 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.140074 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.151967 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.161486 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:33Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.225615 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.225702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.225737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.225771 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.225790 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.328751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.328814 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.328836 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.328863 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.328882 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.432513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.432557 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.432569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.432587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.432599 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.535569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.535617 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.535630 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.535654 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.535672 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.638690 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.638760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.638780 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.638807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.638823 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.741525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.741569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.741584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.741605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.741620 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.844910 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.845016 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.845091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.845122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.845141 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.898846 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.898891 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.898914 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.898974 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:33 crc kubenswrapper[4713]: E0127 15:48:33.899073 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:33 crc kubenswrapper[4713]: E0127 15:48:33.899106 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:33 crc kubenswrapper[4713]: E0127 15:48:33.899171 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:33 crc kubenswrapper[4713]: E0127 15:48:33.899318 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.909116 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-03 04:15:03.759006659 +0000 UTC Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.948160 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.948244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.948262 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.948286 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:33 crc kubenswrapper[4713]: I0127 15:48:33.948302 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:33Z","lastTransitionTime":"2026-01-27T15:48:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.051621 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.051672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.051684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.051704 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.051717 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.154518 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.154564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.154578 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.154597 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.154611 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.258122 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.258199 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.258219 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.258244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.258266 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.362134 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.362194 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.362211 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.362237 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.362257 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.464901 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.464953 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.464971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.464998 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.465021 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.568511 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.568564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.568577 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.568596 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.568608 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.671569 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.671629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.671641 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.671668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.671681 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.774621 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.774709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.774743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.774777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.774800 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.877889 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.877944 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.877952 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.877971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.877982 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.909319 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 00:50:05.669769247 +0000 UTC Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.981032 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.981150 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.981173 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.981203 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:34 crc kubenswrapper[4713]: I0127 15:48:34.981228 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:34Z","lastTransitionTime":"2026-01-27T15:48:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.084471 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.084525 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.084537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.084559 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.084575 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.187437 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.187481 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.187490 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.187510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.187520 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.290220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.290290 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.290308 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.290334 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.290352 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.392702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.392742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.392751 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.392768 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.392778 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.495177 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.495680 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.495701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.495720 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.495732 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.600200 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.600275 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.600303 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.600338 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.600363 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.702942 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.703023 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.703069 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.703096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.703119 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.731722 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.731920 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:39.731878927 +0000 UTC m=+147.510088915 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.806112 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.806155 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.806165 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.806182 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.806194 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.832903 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.832957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.832991 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.833022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833144 4713 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833176 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833202 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833217 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833226 4713 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833238 4713 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833241 4713 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833216 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf 
podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.833196125 +0000 UTC m=+147.611406063 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833355 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.833311769 +0000 UTC m=+147.611521717 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833391 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.833382601 +0000 UTC m=+147.611592659 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833445 4713 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.833494 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.833484774 +0000 UTC m=+147.611694712 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.899080 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.899148 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.899177 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.899189 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.899285 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.899375 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.899543 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:35 crc kubenswrapper[4713]: E0127 15:48:35.899702 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.909540 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-03 04:49:51.468192061 +0000 UTC Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.909750 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.909777 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.909788 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.909803 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:35 crc kubenswrapper[4713]: I0127 15:48:35.909816 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:35Z","lastTransitionTime":"2026-01-27T15:48:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.012644 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.012700 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.012713 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.012735 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.012749 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.115905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.115960 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.115973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.115994 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.116009 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.219432 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.219509 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.219521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.219540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.219553 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.323394 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.323454 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.323466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.323486 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.323502 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.426587 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.426636 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.426649 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.426668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.426681 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.529426 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.529485 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.529498 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.529515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.529526 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.632100 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.632138 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.632149 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.632166 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.632178 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.735185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.735220 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.735228 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.735244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.735257 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.838537 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.838577 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.838588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.838608 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.838619 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.909328 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.910083 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:34:06.127010953 +0000 UTC Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.940629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.940693 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.940712 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.940737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:36 crc kubenswrapper[4713]: I0127 15:48:36.940757 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:36Z","lastTransitionTime":"2026-01-27T15:48:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.043862 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.043937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.043956 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.043985 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.044004 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.146915 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.146974 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.146986 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.147003 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.147017 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.249979 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.250093 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.250121 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.250154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.250178 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.352745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.352799 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.352811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.352834 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.352850 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.455552 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.455597 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.455609 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.455628 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.455638 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.558598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.558651 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.558663 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.558684 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.558700 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.662680 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.662745 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.662762 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.662785 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.662798 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.765811 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.765881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.765900 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.765932 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.765954 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.869255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.869306 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.869316 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.869333 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.869348 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.898875 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.898937 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:37 crc kubenswrapper[4713]: E0127 15:48:37.899089 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.899103 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.899130 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:37 crc kubenswrapper[4713]: E0127 15:48:37.899192 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:37 crc kubenswrapper[4713]: E0127 15:48:37.899267 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:37 crc kubenswrapper[4713]: E0127 15:48:37.899334 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.910913 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 19:55:23.584778186 +0000 UTC Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.972339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.972401 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.972418 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.972444 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:37 crc kubenswrapper[4713]: I0127 15:48:37.972462 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:37Z","lastTransitionTime":"2026-01-27T15:48:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.075526 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.075562 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.075572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.075588 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.075599 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.178261 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.178325 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.178339 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.178368 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.178396 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.282202 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.282285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.282310 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.282341 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.282364 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.384706 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.385133 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.385215 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.385311 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.385383 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.488397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.488737 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.488833 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.488917 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.488986 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.591819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.591871 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.591886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.591906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.591921 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.694297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.694370 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.694385 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.694409 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.694420 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.797931 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.797992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.798004 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.798027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.798063 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.901080 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.901130 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.901141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.901157 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.901170 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:38Z","lastTransitionTime":"2026-01-27T15:48:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:38 crc kubenswrapper[4713]: I0127 15:48:38.911094 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:35:28.134683021 +0000 UTC Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.003980 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.004019 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.004027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.004055 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.004090 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.107078 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.107438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.107504 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.107579 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.107654 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.210524 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.210841 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.210947 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.211024 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.211115 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.313715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.314125 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.314207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.314277 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.314353 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.417159 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.417206 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.417221 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.417244 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.417259 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.519898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.520236 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.520353 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.520442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.520532 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.627260 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.627316 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.627330 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.627348 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.627358 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.710721 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.711105 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.711205 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.711313 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.711408 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.723413 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.727238 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.727285 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.727300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.727318 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.727333 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.740646 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.745443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.745506 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.745522 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.745545 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.745558 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.759949 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.764367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.764561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.764625 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.764732 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.764802 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.779978 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.783488 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.783538 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.783556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.783582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.783599 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.796071 4713 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"12db812d-bafb-428d-b4ce-76a18a0ee412\\\",\\\"systemUUID\\\":\\\"dfac922c-dc6b-48c7-bc81-f26a3c211b98\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:39Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.796209 4713 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.798305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.798360 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.798372 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.798390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.798403 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.898754 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.898800 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.899440 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.899434 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.898871 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.899675 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.899789 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:39 crc kubenswrapper[4713]: E0127 15:48:39.899897 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.900512 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.900545 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.900556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.900572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.900582 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:39Z","lastTransitionTime":"2026-01-27T15:48:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:39 crc kubenswrapper[4713]: I0127 15:48:39.911711 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 04:34:07.724382622 +0000 UTC Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.003759 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.003807 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.003819 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.003835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.003848 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.107507 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.107564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.107581 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.107604 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.107619 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.211383 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.211729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.212252 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.212380 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.212462 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.315540 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.315606 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.315626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.315660 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.315684 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.418412 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.418476 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.418489 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.418510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.418525 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.521096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.521158 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.521167 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.521186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.521200 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.623589 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.623654 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.623665 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.623692 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.623705 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.727442 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.727515 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.727534 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.727563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.727581 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.830638 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.830688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.830697 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.830719 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.830730 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.912292 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 11:54:55.95513095 +0000 UTC Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.933937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.933992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.934007 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.934026 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:40 crc kubenswrapper[4713]: I0127 15:48:40.934065 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:40Z","lastTransitionTime":"2026-01-27T15:48:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.037233 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.037297 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.037305 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.037324 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.037335 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.140797 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.140891 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.140905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.140921 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.140936 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.243492 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.243563 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.243577 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.243598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.243616 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.346903 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.346951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.346971 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.346991 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.347004 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.449687 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.449760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.449770 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.449789 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.449802 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.552583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.552626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.552668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.552688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.552700 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.655604 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.655688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.655704 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.655722 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.655735 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.759560 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.759629 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.759639 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.759668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.759681 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.862610 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.862678 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.862698 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.862728 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.862747 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.898673 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.898763 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.898667 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:41 crc kubenswrapper[4713]: E0127 15:48:41.898963 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.898694 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:41 crc kubenswrapper[4713]: E0127 15:48:41.899326 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:41 crc kubenswrapper[4713]: E0127 15:48:41.899303 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:41 crc kubenswrapper[4713]: E0127 15:48:41.898817 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.912974 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 13:16:14.836968365 +0000 UTC Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.965443 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.965491 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.965499 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.965516 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:41 crc kubenswrapper[4713]: I0127 15:48:41.965526 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:41Z","lastTransitionTime":"2026-01-27T15:48:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.068466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.068510 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.068519 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.068536 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.068546 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.170559 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.170605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.170616 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.170634 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.170648 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.273135 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.273207 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.273222 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.273241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.273253 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.376309 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.376381 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.376399 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.376425 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.376446 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.479674 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.479715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.479726 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.479744 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.479759 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.582760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.582831 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.582851 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.582892 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.582930 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.686650 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.686730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.686748 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.686776 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.686797 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.790046 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.790088 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.790096 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.790111 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.790123 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.893603 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.893651 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.893659 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.893678 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.893687 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.899960 4713 scope.go:117] "RemoveContainer" containerID="bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.913191 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 05:26:07.400652308 +0000 UTC Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.913397 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-aler
ter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.930297 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e
0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.942081 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:42 crc 
kubenswrapper[4713]: I0127 15:48:42.961816 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.981104 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.996178 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:42Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.998749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.998810 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.998824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.998845 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:42 crc kubenswrapper[4713]: I0127 15:48:42.998858 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:42Z","lastTransitionTime":"2026-01-27T15:48:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.010937 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.023254 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe
4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.034656 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.050194 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a843dc7-1222-4c17-ac5b-f332125959a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a2ab8d705c4225fb86d376b6ed5e6c87fcdb23e253f4dad48f24b180aeaf191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.064085 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.079216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5227d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4
d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.101012 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.101749 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.101791 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.101804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.101824 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.101840 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.112990 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.124961 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\
\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578
bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.137454 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.150523 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.163990 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay
.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-s
yncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.176172 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.204565 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.204612 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.204622 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 
15:48:43.204640 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.204651 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.306685 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.306715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.306725 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.306742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.306755 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.404452 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/2.log" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.408910 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.408946 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.408954 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.408970 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.408980 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.409694 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.410748 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.423989 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.434937 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a843dc7-1222-4c17-ac5b-f332125959a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a2ab8d705c4225fb86d376b6ed5e6c87fcdb23e253f4dad48f24b180aeaf191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.448180 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.461311 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.474991 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.489645 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.503656 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.511230 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.511542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.511672 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.511812 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.511900 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.528224 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.554605 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522
7d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.588420 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":
\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\"
:true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.603659 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c6816002
14621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.614564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.614938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.615078 4713 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.615159 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.615233 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.624286 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35
825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] 
\\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.638284 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.663330 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.677167 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.689498 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.702115 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.711707 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.717332 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.717377 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.717389 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.717419 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.717434 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.723336 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:43 crc 
kubenswrapper[4713]: I0127 15:48:43.819804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.819874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.819885 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.819899 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.819912 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.899454 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:43 crc kubenswrapper[4713]: E0127 15:48:43.899619 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.900134 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.900166 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.900184 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:43 crc kubenswrapper[4713]: E0127 15:48:43.900744 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:43 crc kubenswrapper[4713]: E0127 15:48:43.900981 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:43 crc kubenswrapper[4713]: E0127 15:48:43.901144 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.913480 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 23:59:34.306592766 +0000 UTC Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.922808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.922874 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.922886 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.922905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:43 crc kubenswrapper[4713]: I0127 15:48:43.922922 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:43Z","lastTransitionTime":"2026-01-27T15:48:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.025464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.025521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.025534 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.025553 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.025569 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.128154 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.128232 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.128255 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.128290 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.128316 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.232582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.232653 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.232691 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.232729 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.232755 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.336703 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.336760 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.336779 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.336804 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.336825 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.415332 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/3.log" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.415883 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/2.log" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.418476 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" exitCode=1 Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.418520 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.418559 4713 scope.go:117] "RemoveContainer" containerID="bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.419471 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:48:44 crc kubenswrapper[4713]: E0127 15:48:44.419684 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.440379 4713 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.440450 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.440464 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.440485 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.440523 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.441722 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d347
20243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.462493 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.474116 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.488626 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc 
kubenswrapper[4713]: I0127 15:48:44.509952 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.524981 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.538186 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.543790 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.543923 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.544005 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.544104 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.544184 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.551752 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.564589 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.576844 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"stat
e\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.593726 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a843dc7-1222-4c17-ac5b-f332125959a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a2ab8d705c4225fb86d376b6ed5e6c87fcdb23e253f4dad48f24b180aeaf191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.614699 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.627107 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.646943 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.647010 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.647020 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.647053 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.647072 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.650602 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bad9b23f19a2a4f2199d4f2e44908da7b95d6608df4b82551f90deb49f96a7b8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:12Z\\\",\\\"message\\\":\\\"0127 15:48:12.771294 6448 obj_retry.go:409] Going to retry *v1.Pod resource setup for 16 objects: [openshift-multus/multus-additional-cni-plugins-gb57w openshift-network-operator/iptables-alerter-4ln5h openshift-kube-apiserver/kube-apiserver-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf 
openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm openshift-etcd/etcd-crc openshift-image-registry/node-ca-4l8xv openshift-multus/network-metrics-daemon-mdw5k openshift-network-diagnostics/network-check-source-55646444c4-trplf openshift-network-diagnostics/network-check-target-xd92c openshift-network-node-identity/network-node-identity-vrzqb openshift-ovn-kubernetes/ovnkube-node-xs9tk openshift-machine-config-operator/machine-config-daemon-6h5wz openshift-multus/multus-n7wxq openshift-network-console/networking-console-plugin-85b44fc459-gdk6g openshift-kube-controller-manager/kube-controller-manager-crc]\\\\nI0127 15:48:12.771329 6448 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nF0127 15:48:12.771336 6448 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:12Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:43Z\\\",\\\"message\\\":\\\":43.760940 6897 services_controller.go:445] Built service openshift-kube-storage-version-migrator-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0127 15:48:43.760956 6897 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was 
not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z]\\\\nI0127 15:48:43.760952 6897 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.665991 4713 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\
\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.684171 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.696972 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.710980 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.725524 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522
7d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:44Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.749846 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.749893 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.749906 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.749924 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: 
I0127 15:48:44.749936 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.853296 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.853369 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.853390 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.853419 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.853442 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.913795 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 00:19:34.791844126 +0000 UTC Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.956657 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.956701 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.956710 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.956728 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:44 crc kubenswrapper[4713]: I0127 15:48:44.956739 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:44Z","lastTransitionTime":"2026-01-27T15:48:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.059709 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.059754 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.059765 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.059783 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.059796 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.162857 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.162905 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.162938 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.162957 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.162971 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.265690 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.265743 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.265766 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.265793 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.265808 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.368185 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.368234 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.368243 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.368266 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.368278 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.424837 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/3.log" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.429659 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:48:45 crc kubenswrapper[4713]: E0127 15:48:45.429850 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.463506 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"06a1f962-cd6f-4b6e-aae9-c15250d36662\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13755062d573f3ab2294dc604fe19f83eedd2abd48682ec7ae903223cfb05642\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0971c2428677ec8b52f1bf8dedc6b8b1d405194b416a45d9de31f6b36c48e761\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d8549d10b02863aef818fa94a9502d9c1d7fafc16dbffe8bacfcc918d97be4c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://4201ebfb2c04906d4e7a4e292bb5efad29536b3dadee2e856d217678aaede371\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://49e52d8a4930e0d55ceef3b1e8fbed6c192b6c36cb664f701f89f20d9ffc2d7d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d66a91aff7fdef7a00763124585bf4ab3c24a6ea995de4d151c3f6882bb1497d\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://58ebe5e54821a686535400519fb0d10295e99f33736e968504c3a40959c264c0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd62d0988d878fbdfd2e7d8887d7722f56c95d3f4b30c2322e2230447896e8f8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.470514 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.470561 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.470573 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.470594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.470609 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.477752 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://3dbcd10469ff6c80c036268e724763fe6b851c7f673e5b32cb414792cdc10c27\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.493821 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.507029 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:35Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://16393f24bc0b4c8774bcd0a5798919e617eddf4bd2c03fd29a51a2a5e0a490f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.518416 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-4l8xv" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8bdd78db-909c-421c-a332-af38a8a6ba00\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://39d9d6f20b81f957cdae3b412034f7b2d3f66b06916902e0326a4e232449f29d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qltjf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:39Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-4l8xv\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.531623 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"81e4c47d-af8d-44f0-beff-17cf5f133ff7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vbjvc\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:50Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-mdw5k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.551405 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3d449c12-7b67-4bee-9ce4-66a208d06c54\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2518ddc03128d7979d73c3f98bd42c82e8bf4a79a04827d79c8b628d71d99eee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea5bb468807f63419d201af98fac94f10979f6f52f9111f699d80271ab3d93ef\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7869feca99a1ab37cbd376ff4c7255b0e61aa393f2d65d27035c8e962c3ed24a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30
799b9c1a4eb3f5c443ab60d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6c04971f4b00f90251935ab95003dfe418ec4db30799b9c1a4eb3f5c443ab60d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.562955 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0a843dc7-1222-4c17-ac5b-f332125959a4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:15Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8a2ab8d705c4225fb86d376b6ed5e6c87fcdb23e253f4dad48f24b180aeaf191\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://dfcb6b7dc5d545c8dc94df8a83007e065fe052bbaf571f405ae1b7ea449f096e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.573300 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.573370 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.573380 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.573395 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.573405 4713 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.580666 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:33Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d2c50ba219565cf30f60fd0d87a8ed856ab582f69fbc9c899bcf572d8715bea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath
\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1097839997a43e227b8b71ef2362a74f6c0ce03301a03b7ed9d19e8d0e5dd8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.592216 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-j5sgs" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7dab7b05-4eca-4f3d-b53d-ad1e0042cd55\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f9f6777ede766a220684817ff44d701170947e8d6f8ef19235fe3086ef871a39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-tp9nt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-j5sgs\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.603049 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9bd62e63-357b-4f16-a2a1-e6a1d2375808\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9539a85e149f1e498b94c5a72fb078c191fb1eda07df052f69daae393fb12039\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vj8qv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-6h5wz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.613927 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"64b537b7-4ed5-4dbb-b064-53eb12c2a2e8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0d94793c59baf8bc6c378dcd69d7989523efe0ebd9e3bb9773c76f158e234a5a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://54f364bc4e02c62433896377b0bf061379fbe2004f1edcab78e83b45bde8df26\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://792c81bd126ed4f22a51c1ace9f19355169beb9726ebbeef2d45c7b409d6cc1e\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.625685 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.640517 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-n7wxq" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bf07c585-f90e-4416-a66c-d41547008320\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:48:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:23Z\\\",\\\"message\\\":\\\"2026-01-27T15:47:38+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3\\\\n2026-01-27T15:47:38+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_8fa15a8c-4373-4269-8f8e-a02eb250f1c3 to /host/opt/cni/bin/\\\\n2026-01-27T15:47:38Z [verbose] multus-daemon started\\\\n2026-01-27T15:47:38Z [verbose] 
Readiness Indicator file check\\\\n2026-01-27T15:48:23Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:48:24Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-86thf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:36Z\\\"}}\" for pod \"openshift-multus\"/\"multus-n7wxq\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.657109 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-gb57w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9e1879ed-280f-4fea-982c-6203ba438008\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://522
7d146fffc4bd0ba88a52c3f3ce9216f7a087950d3aca614bd095255f10fb9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7c647387365a7df57901e2446523e58d29d046a68b4146fcdfa69e42bf50953d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://78cb44932251c5f2eed2ca1f82e53293799fa929093f897aeeb9eb860b3cfe4b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:39Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8bc3d10fd21f2df6c30caa0a5c6e7865dac049b13400b1e496dcae0a4cc9592d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1a6e4d52e35dc213318d18765d153c93dabdd5272874af9ea030384b56d78fee\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f7a29aa98835556b6072f939a90d3311e27f7d39d495419665afb2bcd25620da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9ae1135fb21cd7e4576952ce5fb69bd5be0225e33016bfc26bf54a45957c2dba\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dqvr9\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-gb57w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.675349 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"131f5d56-4900-4558-abfa-24c9e999e5ad\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-27T15:48:43Z\\\",\\\"message\\\":\\\":43.760940 6897 services_controller.go:445] Built service openshift-kube-storage-version-migrator-operator/metrics LB template configs for network=default: []services.lbConfig(nil)\\\\nF0127 15:48:43.760956 6897 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create 
admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:43Z is after 2025-08-24T17:21:41Z]\\\\nI0127 15:48:43.760952 6897 services_controller.go:451] Built service openshift-kube-storage-version-migrator-operator/metrics cluster-wide LB for network=default: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-kube-storage-version-migrator-operator/metrics_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:48:42Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e4acbc7be6d67a24b4
4e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:37Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5w9kd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:37Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-xs9tk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.675911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.675929 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.675937 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.675951 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.675960 4713 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.688295 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6017cf1d-2f04-45b3-bc32-4a1cae590bc5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:48Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef563913a9310bc0f8ec97bc78f1d5fc0f524e4df14009e911231210ce50372b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://cf0499b5739298ebcf250a458b495c681600214621fc1c83b074b3ebe8c4a682\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9p5cj\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:48Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-cqmfm\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.706577 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c8671861-b7cf-4af6-96cc-fcde117c229e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:13Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f894
5c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"espace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": dial tcp [::1]:6443: connect: connection refused\\\\nI0127 15:47:16.388263 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0127 15:47:16.388800 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-897477557/tls.crt::/tmp/serving-cert-897477557/tls.key\\\\\\\"\\\\nI0127 15:47:31.599210 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0127 15:47:31.606229 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0127 15:47:31.606259 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0127 15:47:31.606281 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0127 15:47:31.606289 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0127 15:47:31.612903 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0127 15:47:31.612929 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0127 15:47:31.612934 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nI0127 15:47:31.612933 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nW0127 15:47:31.612939 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0127 15:47:31.612944 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0127 15:47:31.612947 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0127 15:47:31.612950 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0127 15:47:31.618002 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-27T15:47:15Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771
aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-27T15:47:14Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-27T15:47:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-27T15:47:13Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.721245 4713 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-27T15:47:31Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-27T15:48:45Z is after 2025-08-24T17:21:41Z" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.779312 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.779358 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 
crc kubenswrapper[4713]: I0127 15:48:45.779367 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.779383 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.779393 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.882606 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.882654 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.882668 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.882688 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.882704 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.898699 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.898758 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.898771 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.898713 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:45 crc kubenswrapper[4713]: E0127 15:48:45.898895 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:45 crc kubenswrapper[4713]: E0127 15:48:45.899004 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:45 crc kubenswrapper[4713]: E0127 15:48:45.899127 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:45 crc kubenswrapper[4713]: E0127 15:48:45.899183 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.914778 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 13:33:37.202626774 +0000 UTC Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.986141 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.986556 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.986687 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:45 crc kubenswrapper[4713]: I0127 15:48:45.986835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:45 crc 
kubenswrapper[4713]: I0127 15:48:45.986972 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:45Z","lastTransitionTime":"2026-01-27T15:48:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.090602 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.090645 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.090657 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.090677 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.090690 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.194511 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.194582 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.194600 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.194626 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.194644 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.297352 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.297438 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.297451 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.297474 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.297487 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.400730 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.400796 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.400808 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.400830 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.400842 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.504150 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.504406 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.504429 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.504463 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.504484 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.607376 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.607441 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.607456 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.607484 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.607499 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.710397 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.710473 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.710494 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.710517 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.710532 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.813909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.813966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.813977 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.813997 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.814008 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.915384 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 21:52:25.373507045 +0000 UTC Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.917667 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.917724 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.917734 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.917755 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:46 crc kubenswrapper[4713]: I0127 15:48:46.917768 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:46Z","lastTransitionTime":"2026-01-27T15:48:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.020715 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.020758 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.020769 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.020787 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.020799 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.124062 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.124104 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.124114 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.124129 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.124140 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.226911 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.226984 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.226992 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.227014 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.227025 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.330468 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.330529 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.330542 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.330564 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.330580 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.433241 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.433291 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.433302 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.433322 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.433335 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.536092 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.536170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.536180 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.536201 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.536215 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.639272 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.639340 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.639354 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.639371 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.639381 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.742027 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.742082 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.742101 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.742117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.742127 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.844616 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.844683 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.844700 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.844718 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.844730 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.899432 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.899467 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.899476 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.899432 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:47 crc kubenswrapper[4713]: E0127 15:48:47.899612 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:47 crc kubenswrapper[4713]: E0127 15:48:47.899696 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:47 crc kubenswrapper[4713]: E0127 15:48:47.899778 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:47 crc kubenswrapper[4713]: E0127 15:48:47.899838 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.915609 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 04:48:51.943384797 +0000 UTC Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.947642 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.947690 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.947702 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.947723 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:47 crc kubenswrapper[4713]: I0127 15:48:47.947736 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:47Z","lastTransitionTime":"2026-01-27T15:48:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.051176 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.051230 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.051242 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.051263 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.051277 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.154714 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.154767 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.154778 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.154800 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.154817 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.257835 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.257881 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.257894 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.257913 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.257927 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.361521 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.361584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.361598 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.361621 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.361648 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.464972 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.465077 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.465097 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.465117 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.465129 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.571474 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.571523 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.571534 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.571551 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.571563 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.675336 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.675382 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.675394 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.675414 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.675424 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.778842 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.778898 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.778909 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.778935 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.778948 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.881916 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.881966 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.881973 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.881989 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.881998 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:48Z","lastTransitionTime":"2026-01-27T15:48:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:48 crc kubenswrapper[4713]: I0127 15:48:48.917266 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 17:00:04.711837825 +0000 UTC Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.426533 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.426574 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.426583 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.426601 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.426612 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:49Z","lastTransitionTime":"2026-01-27T15:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.529531 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.529584 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.529594 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.529611 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.529624 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:49Z","lastTransitionTime":"2026-01-27T15:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.632398 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.632455 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.632466 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.632482 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.632496 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:49Z","lastTransitionTime":"2026-01-27T15:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.735767 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.735818 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.735830 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.735853 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.735869 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:49Z","lastTransitionTime":"2026-01-27T15:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.838513 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.838572 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.838585 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.838605 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.838618 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:49Z","lastTransitionTime":"2026-01-27T15:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.899246 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.899271 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:49 crc kubenswrapper[4713]: E0127 15:48:49.899831 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:49 crc kubenswrapper[4713]: E0127 15:48:49.899891 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.899450 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.899280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:49 crc kubenswrapper[4713]: E0127 15:48:49.900000 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:49 crc kubenswrapper[4713]: E0127 15:48:49.900205 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.917490 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 13:21:00.069532799 +0000 UTC Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.941091 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.941170 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.941186 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.941205 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:49 crc kubenswrapper[4713]: I0127 15:48:49.941218 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:49Z","lastTransitionTime":"2026-01-27T15:48:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.025198 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.025534 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.025630 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.025742 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.025833 4713 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-27T15:48:50Z","lastTransitionTime":"2026-01-27T15:48:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.074489 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w"] Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.075108 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.078484 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.079228 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.079622 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.081360 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.107719 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-4l8xv" podStartSLOduration=75.107695307 podStartE2EDuration="1m15.107695307s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.106966665 +0000 UTC m=+97.885176603" watchObservedRunningTime="2026-01-27 15:48:50.107695307 +0000 UTC m=+97.885905245" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.132589 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d76025e-2fbc-426b-af89-5c895fe67fd9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.132645 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d76025e-2fbc-426b-af89-5c895fe67fd9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.132663 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d76025e-2fbc-426b-af89-5c895fe67fd9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.132722 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8d76025e-2fbc-426b-af89-5c895fe67fd9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.132748 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8d76025e-2fbc-426b-af89-5c895fe67fd9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.145344 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=74.145319187 podStartE2EDuration="1m14.145319187s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.144824122 +0000 UTC m=+97.923034070" watchObservedRunningTime="2026-01-27 15:48:50.145319187 +0000 UTC m=+97.923529125" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.187174 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-j5sgs" podStartSLOduration=75.187150351 podStartE2EDuration="1m15.187150351s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.185670147 +0000 UTC m=+97.963880085" watchObservedRunningTime="2026-01-27 15:48:50.187150351 +0000 UTC m=+97.965360309" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.200356 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podStartSLOduration=75.20033371 podStartE2EDuration="1m15.20033371s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.199868026 +0000 UTC m=+97.978077964" watchObservedRunningTime="2026-01-27 15:48:50.20033371 +0000 UTC m=+97.978543648" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.221166 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=43.221137363 podStartE2EDuration="43.221137363s" podCreationTimestamp="2026-01-27 15:48:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.220949428 +0000 UTC m=+97.999159376" watchObservedRunningTime="2026-01-27 15:48:50.221137363 +0000 UTC 
m=+97.999347311" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233358 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8d76025e-2fbc-426b-af89-5c895fe67fd9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233440 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d76025e-2fbc-426b-af89-5c895fe67fd9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233475 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d76025e-2fbc-426b-af89-5c895fe67fd9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233498 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d76025e-2fbc-426b-af89-5c895fe67fd9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233535 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8d76025e-2fbc-426b-af89-5c895fe67fd9-etc-ssl-certs\") 
pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/8d76025e-2fbc-426b-af89-5c895fe67fd9-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.233867 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/8d76025e-2fbc-426b-af89-5c895fe67fd9-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.235143 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/8d76025e-2fbc-426b-af89-5c895fe67fd9-service-ca\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.240477 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d76025e-2fbc-426b-af89-5c895fe67fd9-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.253448 4713 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=14.253424586 podStartE2EDuration="14.253424586s" podCreationTimestamp="2026-01-27 15:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.23289916 +0000 UTC m=+98.011109138" watchObservedRunningTime="2026-01-27 15:48:50.253424586 +0000 UTC m=+98.031634524" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.259932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8d76025e-2fbc-426b-af89-5c895fe67fd9-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-4cm5w\" (UID: \"8d76025e-2fbc-426b-af89-5c895fe67fd9\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.274824 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-gb57w" podStartSLOduration=74.274803506 podStartE2EDuration="1m14.274803506s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.274425335 +0000 UTC m=+98.052635283" watchObservedRunningTime="2026-01-27 15:48:50.274803506 +0000 UTC m=+98.053013444" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.314557 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-cqmfm" podStartSLOduration=74.314534769 podStartE2EDuration="1m14.314534769s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.313805067 +0000 UTC m=+98.092015045" 
watchObservedRunningTime="2026-01-27 15:48:50.314534769 +0000 UTC m=+98.092744707" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.329401 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=76.329375766 podStartE2EDuration="1m16.329375766s" podCreationTimestamp="2026-01-27 15:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.329312845 +0000 UTC m=+98.107522783" watchObservedRunningTime="2026-01-27 15:48:50.329375766 +0000 UTC m=+98.107585724" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.366489 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-n7wxq" podStartSLOduration=74.366466261 podStartE2EDuration="1m14.366466261s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.365909554 +0000 UTC m=+98.144119502" watchObservedRunningTime="2026-01-27 15:48:50.366466261 +0000 UTC m=+98.144676199" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.381822 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.381804313 podStartE2EDuration="1m18.381804313s" podCreationTimestamp="2026-01-27 15:47:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:50.38134902 +0000 UTC m=+98.159558968" watchObservedRunningTime="2026-01-27 15:48:50.381804313 +0000 UTC m=+98.160014241" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.390495 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.446664 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" event={"ID":"8d76025e-2fbc-426b-af89-5c895fe67fd9","Type":"ContainerStarted","Data":"2697ffe0cefe21549188e23670e424633c9ff4db3cf806461af0aa806583ec41"} Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.918508 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 21:44:12.827323524 +0000 UTC Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.918585 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 27 15:48:50 crc kubenswrapper[4713]: I0127 15:48:50.928837 4713 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 27 15:48:51 crc kubenswrapper[4713]: I0127 15:48:51.452086 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" event={"ID":"8d76025e-2fbc-426b-af89-5c895fe67fd9","Type":"ContainerStarted","Data":"e19d651f2bdfa13ce38c94ca6065c33f0d2e6e78a9734b5f79e0de6680dab881"} Jan 27 15:48:51 crc kubenswrapper[4713]: I0127 15:48:51.899555 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:51 crc kubenswrapper[4713]: I0127 15:48:51.899676 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:51 crc kubenswrapper[4713]: E0127 15:48:51.899744 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:51 crc kubenswrapper[4713]: I0127 15:48:51.899572 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:51 crc kubenswrapper[4713]: I0127 15:48:51.899590 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:51 crc kubenswrapper[4713]: E0127 15:48:51.899856 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:51 crc kubenswrapper[4713]: E0127 15:48:51.900060 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:51 crc kubenswrapper[4713]: E0127 15:48:51.900160 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:53 crc kubenswrapper[4713]: I0127 15:48:53.899360 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:53 crc kubenswrapper[4713]: I0127 15:48:53.899437 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:53 crc kubenswrapper[4713]: I0127 15:48:53.899443 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:53 crc kubenswrapper[4713]: I0127 15:48:53.899554 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:53 crc kubenswrapper[4713]: E0127 15:48:53.899552 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:53 crc kubenswrapper[4713]: E0127 15:48:53.899608 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:53 crc kubenswrapper[4713]: E0127 15:48:53.899650 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:53 crc kubenswrapper[4713]: E0127 15:48:53.899709 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:54 crc kubenswrapper[4713]: I0127 15:48:54.373784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:54 crc kubenswrapper[4713]: E0127 15:48:54.373942 4713 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:48:54 crc kubenswrapper[4713]: E0127 15:48:54.374025 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs podName:81e4c47d-af8d-44f0-beff-17cf5f133ff7 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:58.3740007 +0000 UTC m=+166.152210638 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs") pod "network-metrics-daemon-mdw5k" (UID: "81e4c47d-af8d-44f0-beff-17cf5f133ff7") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 27 15:48:55 crc kubenswrapper[4713]: I0127 15:48:55.899267 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:55 crc kubenswrapper[4713]: E0127 15:48:55.900171 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:55 crc kubenswrapper[4713]: I0127 15:48:55.899273 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:55 crc kubenswrapper[4713]: I0127 15:48:55.899273 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:55 crc kubenswrapper[4713]: I0127 15:48:55.899307 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:55 crc kubenswrapper[4713]: E0127 15:48:55.900679 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:55 crc kubenswrapper[4713]: E0127 15:48:55.901231 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:55 crc kubenswrapper[4713]: E0127 15:48:55.901125 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:57 crc kubenswrapper[4713]: I0127 15:48:57.898867 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:57 crc kubenswrapper[4713]: I0127 15:48:57.898858 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:57 crc kubenswrapper[4713]: E0127 15:48:57.899731 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:57 crc kubenswrapper[4713]: I0127 15:48:57.898909 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:57 crc kubenswrapper[4713]: E0127 15:48:57.899842 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:57 crc kubenswrapper[4713]: E0127 15:48:57.899611 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:48:57 crc kubenswrapper[4713]: I0127 15:48:57.899872 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:48:57 crc kubenswrapper[4713]: I0127 15:48:57.898943 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:57 crc kubenswrapper[4713]: E0127 15:48:57.900193 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:48:57 crc kubenswrapper[4713]: E0127 15:48:57.900230 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:59 crc kubenswrapper[4713]: I0127 15:48:59.899354 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:48:59 crc kubenswrapper[4713]: I0127 15:48:59.899471 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:48:59 crc kubenswrapper[4713]: I0127 15:48:59.899464 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:48:59 crc kubenswrapper[4713]: E0127 15:48:59.899608 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:48:59 crc kubenswrapper[4713]: I0127 15:48:59.899651 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:48:59 crc kubenswrapper[4713]: E0127 15:48:59.899846 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:48:59 crc kubenswrapper[4713]: E0127 15:48:59.899904 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:48:59 crc kubenswrapper[4713]: E0127 15:48:59.899973 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:01 crc kubenswrapper[4713]: I0127 15:49:01.899011 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:01 crc kubenswrapper[4713]: I0127 15:49:01.899165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:01 crc kubenswrapper[4713]: E0127 15:49:01.899220 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:01 crc kubenswrapper[4713]: I0127 15:49:01.899283 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:01 crc kubenswrapper[4713]: I0127 15:49:01.899456 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:01 crc kubenswrapper[4713]: E0127 15:49:01.899446 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:01 crc kubenswrapper[4713]: E0127 15:49:01.899698 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:01 crc kubenswrapper[4713]: E0127 15:49:01.899923 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:03 crc kubenswrapper[4713]: I0127 15:49:03.899542 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:03 crc kubenswrapper[4713]: I0127 15:49:03.899628 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:03 crc kubenswrapper[4713]: I0127 15:49:03.899717 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:03 crc kubenswrapper[4713]: E0127 15:49:03.900164 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:03 crc kubenswrapper[4713]: I0127 15:49:03.900208 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:03 crc kubenswrapper[4713]: E0127 15:49:03.900351 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:03 crc kubenswrapper[4713]: E0127 15:49:03.900472 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:03 crc kubenswrapper[4713]: E0127 15:49:03.900570 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:05 crc kubenswrapper[4713]: I0127 15:49:05.898582 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:05 crc kubenswrapper[4713]: I0127 15:49:05.898662 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:05 crc kubenswrapper[4713]: I0127 15:49:05.898600 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:05 crc kubenswrapper[4713]: E0127 15:49:05.898835 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:05 crc kubenswrapper[4713]: E0127 15:49:05.898968 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:05 crc kubenswrapper[4713]: I0127 15:49:05.899012 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:05 crc kubenswrapper[4713]: E0127 15:49:05.899101 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:05 crc kubenswrapper[4713]: E0127 15:49:05.899202 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:07 crc kubenswrapper[4713]: I0127 15:49:07.899355 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:07 crc kubenswrapper[4713]: I0127 15:49:07.899443 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:07 crc kubenswrapper[4713]: I0127 15:49:07.899526 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:07 crc kubenswrapper[4713]: E0127 15:49:07.899518 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:07 crc kubenswrapper[4713]: I0127 15:49:07.899732 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:07 crc kubenswrapper[4713]: E0127 15:49:07.899716 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:07 crc kubenswrapper[4713]: E0127 15:49:07.899797 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:07 crc kubenswrapper[4713]: E0127 15:49:07.899911 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:08 crc kubenswrapper[4713]: I0127 15:49:08.900140 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:49:08 crc kubenswrapper[4713]: E0127 15:49:08.900493 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-xs9tk_openshift-ovn-kubernetes(131f5d56-4900-4558-abfa-24c9e999e5ad)\"" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" Jan 27 15:49:09 crc kubenswrapper[4713]: I0127 15:49:09.899009 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:09 crc kubenswrapper[4713]: E0127 15:49:09.899623 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:09 crc kubenswrapper[4713]: I0127 15:49:09.899211 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:09 crc kubenswrapper[4713]: E0127 15:49:09.899748 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:09 crc kubenswrapper[4713]: I0127 15:49:09.899362 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:09 crc kubenswrapper[4713]: I0127 15:49:09.899150 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:09 crc kubenswrapper[4713]: E0127 15:49:09.899867 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:09 crc kubenswrapper[4713]: E0127 15:49:09.899955 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.515912 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/1.log" Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.517075 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/0.log" Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.517159 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf07c585-f90e-4416-a66c-d41547008320" containerID="30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd" exitCode=1 Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.517194 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerDied","Data":"30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd"} Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.517231 4713 scope.go:117] "RemoveContainer" containerID="d1c82c8ab5af0ed0b8d8a156ce812cb78bdebd85112a93239358a7a33c15472e" Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.518191 4713 scope.go:117] "RemoveContainer" containerID="30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd" Jan 27 15:49:10 crc kubenswrapper[4713]: E0127 15:49:10.518574 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-n7wxq_openshift-multus(bf07c585-f90e-4416-a66c-d41547008320)\"" pod="openshift-multus/multus-n7wxq" podUID="bf07c585-f90e-4416-a66c-d41547008320" Jan 27 15:49:10 crc kubenswrapper[4713]: I0127 15:49:10.540741 4713 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-4cm5w" podStartSLOduration=95.540710001 podStartE2EDuration="1m35.540710001s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:48:51.4657862 +0000 UTC m=+99.243996148" watchObservedRunningTime="2026-01-27 15:49:10.540710001 +0000 UTC m=+118.318919949" Jan 27 15:49:11 crc kubenswrapper[4713]: I0127 15:49:11.523199 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/1.log" Jan 27 15:49:11 crc kubenswrapper[4713]: I0127 15:49:11.898980 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:11 crc kubenswrapper[4713]: I0127 15:49:11.899125 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:11 crc kubenswrapper[4713]: E0127 15:49:11.899177 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:11 crc kubenswrapper[4713]: I0127 15:49:11.899221 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:11 crc kubenswrapper[4713]: I0127 15:49:11.899234 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:11 crc kubenswrapper[4713]: E0127 15:49:11.899331 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:11 crc kubenswrapper[4713]: E0127 15:49:11.899446 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:11 crc kubenswrapper[4713]: E0127 15:49:11.899521 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:12 crc kubenswrapper[4713]: E0127 15:49:12.901194 4713 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 27 15:49:13 crc kubenswrapper[4713]: E0127 15:49:13.007872 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Jan 27 15:49:13 crc kubenswrapper[4713]: I0127 15:49:13.899522 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:13 crc kubenswrapper[4713]: I0127 15:49:13.899636 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:13 crc kubenswrapper[4713]: I0127 15:49:13.899688 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:13 crc kubenswrapper[4713]: I0127 15:49:13.899853 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:13 crc kubenswrapper[4713]: E0127 15:49:13.899840 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:13 crc kubenswrapper[4713]: E0127 15:49:13.899990 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:13 crc kubenswrapper[4713]: E0127 15:49:13.900087 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:13 crc kubenswrapper[4713]: E0127 15:49:13.900241 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:15 crc kubenswrapper[4713]: I0127 15:49:15.898861 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:15 crc kubenswrapper[4713]: I0127 15:49:15.898972 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:15 crc kubenswrapper[4713]: I0127 15:49:15.899116 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:15 crc kubenswrapper[4713]: E0127 15:49:15.899110 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:15 crc kubenswrapper[4713]: I0127 15:49:15.899145 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:15 crc kubenswrapper[4713]: E0127 15:49:15.899266 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:15 crc kubenswrapper[4713]: E0127 15:49:15.899388 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:15 crc kubenswrapper[4713]: E0127 15:49:15.899467 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:17 crc kubenswrapper[4713]: I0127 15:49:17.899121 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:17 crc kubenswrapper[4713]: I0127 15:49:17.899121 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:17 crc kubenswrapper[4713]: E0127 15:49:17.899306 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:17 crc kubenswrapper[4713]: E0127 15:49:17.899323 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:17 crc kubenswrapper[4713]: I0127 15:49:17.899244 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:17 crc kubenswrapper[4713]: I0127 15:49:17.899149 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:17 crc kubenswrapper[4713]: E0127 15:49:17.899426 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:17 crc kubenswrapper[4713]: E0127 15:49:17.899650 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:18 crc kubenswrapper[4713]: E0127 15:49:18.009156 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 15:49:19 crc kubenswrapper[4713]: I0127 15:49:19.898838 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:19 crc kubenswrapper[4713]: I0127 15:49:19.898892 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:19 crc kubenswrapper[4713]: I0127 15:49:19.898838 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:19 crc kubenswrapper[4713]: E0127 15:49:19.898997 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:19 crc kubenswrapper[4713]: I0127 15:49:19.899087 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:19 crc kubenswrapper[4713]: E0127 15:49:19.899150 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:19 crc kubenswrapper[4713]: E0127 15:49:19.899251 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:19 crc kubenswrapper[4713]: E0127 15:49:19.899359 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:21 crc kubenswrapper[4713]: I0127 15:49:21.899399 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:21 crc kubenswrapper[4713]: I0127 15:49:21.899399 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:21 crc kubenswrapper[4713]: E0127 15:49:21.899584 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:21 crc kubenswrapper[4713]: I0127 15:49:21.899439 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:21 crc kubenswrapper[4713]: I0127 15:49:21.899722 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:21 crc kubenswrapper[4713]: E0127 15:49:21.899889 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:21 crc kubenswrapper[4713]: E0127 15:49:21.899934 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:21 crc kubenswrapper[4713]: E0127 15:49:21.900005 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:23 crc kubenswrapper[4713]: E0127 15:49:23.010180 4713 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 27 15:49:23 crc kubenswrapper[4713]: I0127 15:49:23.899581 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:23 crc kubenswrapper[4713]: I0127 15:49:23.899613 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:23 crc kubenswrapper[4713]: I0127 15:49:23.899679 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:23 crc kubenswrapper[4713]: I0127 15:49:23.899915 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:23 crc kubenswrapper[4713]: E0127 15:49:23.899873 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:23 crc kubenswrapper[4713]: E0127 15:49:23.900083 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:23 crc kubenswrapper[4713]: E0127 15:49:23.900191 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:23 crc kubenswrapper[4713]: E0127 15:49:23.900896 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:23 crc kubenswrapper[4713]: I0127 15:49:23.901613 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:49:24 crc kubenswrapper[4713]: I0127 15:49:24.570167 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/3.log" Jan 27 15:49:24 crc kubenswrapper[4713]: I0127 15:49:24.574581 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerStarted","Data":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} Jan 27 15:49:24 crc kubenswrapper[4713]: I0127 15:49:24.575238 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:49:24 crc kubenswrapper[4713]: I0127 15:49:24.605063 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podStartSLOduration=108.605013362 podStartE2EDuration="1m48.605013362s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:24.604219278 
+0000 UTC m=+132.382429226" watchObservedRunningTime="2026-01-27 15:49:24.605013362 +0000 UTC m=+132.383223300" Jan 27 15:49:24 crc kubenswrapper[4713]: I0127 15:49:24.801487 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mdw5k"] Jan 27 15:49:24 crc kubenswrapper[4713]: I0127 15:49:24.801695 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:24 crc kubenswrapper[4713]: E0127 15:49:24.802075 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:25 crc kubenswrapper[4713]: I0127 15:49:25.899290 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:25 crc kubenswrapper[4713]: I0127 15:49:25.899470 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:25 crc kubenswrapper[4713]: I0127 15:49:25.899487 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:25 crc kubenswrapper[4713]: E0127 15:49:25.899656 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:25 crc kubenswrapper[4713]: I0127 15:49:25.899706 4713 scope.go:117] "RemoveContainer" containerID="30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd" Jan 27 15:49:25 crc kubenswrapper[4713]: E0127 15:49:25.899759 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:25 crc kubenswrapper[4713]: E0127 15:49:25.899966 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:26 crc kubenswrapper[4713]: I0127 15:49:26.584357 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/1.log" Jan 27 15:49:26 crc kubenswrapper[4713]: I0127 15:49:26.584885 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerStarted","Data":"8d44969cb30e758dfd5d789635414864d3e62f8dee75ba64ba6c62a4e8a4600a"} Jan 27 15:49:26 crc kubenswrapper[4713]: I0127 15:49:26.899434 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:26 crc kubenswrapper[4713]: E0127 15:49:26.899952 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-mdw5k" podUID="81e4c47d-af8d-44f0-beff-17cf5f133ff7" Jan 27 15:49:27 crc kubenswrapper[4713]: I0127 15:49:27.899539 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:27 crc kubenswrapper[4713]: I0127 15:49:27.899624 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:27 crc kubenswrapper[4713]: E0127 15:49:27.899769 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 27 15:49:27 crc kubenswrapper[4713]: E0127 15:49:27.899849 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 27 15:49:27 crc kubenswrapper[4713]: I0127 15:49:27.899949 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:27 crc kubenswrapper[4713]: E0127 15:49:27.900016 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 27 15:49:28 crc kubenswrapper[4713]: I0127 15:49:28.899559 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:28 crc kubenswrapper[4713]: I0127 15:49:28.901786 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 15:49:28 crc kubenswrapper[4713]: I0127 15:49:28.902103 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.898820 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.898910 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.898814 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.901770 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.901785 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.901847 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 15:49:29 crc kubenswrapper[4713]: I0127 15:49:29.903098 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.031809 4713 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.086114 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5cl72"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.101063 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.101268 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.102916 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.103501 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-mkwg6"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.103527 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.103804 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mkwg6" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104119 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104168 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-serving-cert\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104233 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-encryption-config\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104337 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0905e52-1e55-4a6b-887a-6680bd4d2004-node-pullsecrets\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104411 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0905e52-1e55-4a6b-887a-6680bd4d2004-audit-dir\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104477 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-etcd-serving-ca\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104528 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-config\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104569 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jrkc\" (UniqueName: \"kubernetes.io/projected/f0905e52-1e55-4a6b-887a-6680bd4d2004-kube-api-access-6jrkc\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc 
kubenswrapper[4713]: I0127 15:49:31.104592 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-image-import-ca\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104753 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-audit\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104788 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-etcd-client\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.104877 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.106811 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t92fx"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.107733 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.109121 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.109445 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.111867 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.112115 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.112574 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.112588 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.112767 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113063 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113154 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113270 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113314 4713 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113418 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113455 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113643 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113708 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113789 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113890 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.113970 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.114100 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.114210 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.114287 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.114312 4713 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.117638 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.117659 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.117681 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.117952 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.117979 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.118117 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.118224 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.118323 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.118632 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.121757 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"] Jan 27 15:49:31 
crc kubenswrapper[4713]: I0127 15:49:31.122530 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.122949 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.123769 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.124030 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.124745 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.130868 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.130868 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.132220 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.132940 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.138698 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.142197 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.142930 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.154377 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.155933 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.156116 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.156142 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.160446 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7vcrd"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.187223 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.190003 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.190311 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.190458 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.190610 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.190780 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.191169 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.191435 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.191532 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.191805 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.192831 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.193053 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.193168 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.193282 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.193387 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.193486 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.193584 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.194692 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.194779 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.196424 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.196533 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" 
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.196616 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.196684 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.196838 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.196934 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.197017 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.197139 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.197230 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.197771 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.197987 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.206608 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.206988 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.206999 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tfz52"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.207571 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.207780 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.207990 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.208129 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-xffcl"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.208200 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.208413 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.208615 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.208653 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.208786 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.209324 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.211272 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mg8jq"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.211727 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.212096 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216684 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-config\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216738 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216764 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grm2d\" (UniqueName: 
\"kubernetes.io/projected/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-kube-api-access-grm2d\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-image-import-ca\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216814 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt9m5\" (UniqueName: \"kubernetes.io/projected/e411ddc4-0c8a-4cae-b08d-264bae41dffc-kube-api-access-kt9m5\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216849 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216873 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 
27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216895 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-etcd-client\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216916 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-config\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216933 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-client-ca\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216952 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216972 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.216997 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.217018 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.218029 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-config\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.218932 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-image-import-ca\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.217072 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef05c38c-a0a0-40b6-9f4c-351768f9b547-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zppll\" (UID: \"ef05c38c-a0a0-40b6-9f4c-351768f9b547\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219455 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-config\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-client-ca\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219524 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtzfz\" (UniqueName: \"kubernetes.io/projected/ad390cc5-3b01-4343-97a2-2c4385fe5142-kube-api-access-rtzfz\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219554 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219594 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-serving-cert\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c64795e-faf7-45f0-8256-2ae8191e4e07-config\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219646 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219667 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad390cc5-3b01-4343-97a2-2c4385fe5142-serving-cert\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219700 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c464a7a2-0836-47e1-818d-aa8aaed4e657-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219723 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e411ddc4-0c8a-4cae-b08d-264bae41dffc-config\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219743 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219767 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0905e52-1e55-4a6b-887a-6680bd4d2004-audit-dir\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219791 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwtzh\" (UniqueName: \"kubernetes.io/projected/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-kube-api-access-vwtzh\") pod 
\"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219819 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-etcd-serving-ca\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219847 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jrkc\" (UniqueName: \"kubernetes.io/projected/f0905e52-1e55-4a6b-887a-6680bd4d2004-kube-api-access-6jrkc\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219876 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-628z8\" (UniqueName: \"kubernetes.io/projected/01f5e6a0-654f-41cb-a694-2c86ca8523d8-kube-api-access-628z8\") pod \"downloads-7954f5f757-mkwg6\" (UID: \"01f5e6a0-654f-41cb-a694-2c86ca8523d8\") " pod="openshift-console/downloads-7954f5f757-mkwg6" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219908 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c64795e-faf7-45f0-8256-2ae8191e4e07-auth-proxy-config\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219937 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b935bba-4a22-4c28-b629-367f4a43bbc0-serving-cert\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-config\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.219998 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-audit\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220021 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9n45\" (UniqueName: \"kubernetes.io/projected/c464a7a2-0836-47e1-818d-aa8aaed4e657-kube-api-access-n9n45\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220064 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: 
\"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220093 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220243 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c464a7a2-0836-47e1-818d-aa8aaed4e657-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220277 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc 
kubenswrapper[4713]: I0127 15:49:31.220327 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220405 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6r4t\" (UniqueName: \"kubernetes.io/projected/aae489c7-6a71-4465-9c25-7d27eb68b318-kube-api-access-k6r4t\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220434 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjn9c\" (UniqueName: \"kubernetes.io/projected/ef05c38c-a0a0-40b6-9f4c-351768f9b547-kube-api-access-jjn9c\") pod \"cluster-samples-operator-665b6dd947-zppll\" (UID: \"ef05c38c-a0a0-40b6-9f4c-351768f9b547\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220456 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-policies\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220507 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220528 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae489c7-6a71-4465-9c25-7d27eb68b318-serving-cert\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220572 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-encryption-config\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-dir\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0905e52-1e55-4a6b-887a-6680bd4d2004-node-pullsecrets\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: 
I0127 15:49:31.220649 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4c64795e-faf7-45f0-8256-2ae8191e4e07-machine-approver-tls\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220674 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjp74\" (UniqueName: \"kubernetes.io/projected/4c64795e-faf7-45f0-8256-2ae8191e4e07-kube-api-access-cjp74\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220701 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e411ddc4-0c8a-4cae-b08d-264bae41dffc-images\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e411ddc4-0c8a-4cae-b08d-264bae41dffc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220752 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220781 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c464a7a2-0836-47e1-818d-aa8aaed4e657-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220810 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rc4t\" (UniqueName: \"kubernetes.io/projected/4b935bba-4a22-4c28-b629-367f4a43bbc0-kube-api-access-8rc4t\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.220935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f0905e52-1e55-4a6b-887a-6680bd4d2004-audit-dir\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.221864 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-etcd-serving-ca\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 
15:49:31.221928 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-trusted-ca-bundle\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.222198 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jjr22"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.223134 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f0905e52-1e55-4a6b-887a-6680bd4d2004-node-pullsecrets\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.223267 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f0905e52-1e55-4a6b-887a-6680bd4d2004-audit\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.224002 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.224905 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.231060 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-etcd-client\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.233378 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-encryption-config\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.234486 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.234786 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.234914 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.234997 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.236265 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 15:49:31 crc 
kubenswrapper[4713]: I0127 15:49:31.236421 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.236531 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.236746 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.236849 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.236950 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237074 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237223 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237248 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237364 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237417 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237522 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237618 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237658 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237773 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.237889 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.238846 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.236736 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0905e52-1e55-4a6b-887a-6680bd4d2004-serving-cert\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.239082 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.239774 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.241844 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.242019 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.243975 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.244456 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.252470 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.256101 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.259189 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.264833 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.265114 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.264715 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.265600 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.275002 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.275515 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.275814 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.276400 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.276900 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.277112 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jrkc\" (UniqueName: \"kubernetes.io/projected/f0905e52-1e55-4a6b-887a-6680bd4d2004-kube-api-access-6jrkc\") pod \"apiserver-76f77b778f-5cl72\" (UID: \"f0905e52-1e55-4a6b-887a-6680bd4d2004\") " pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.277482 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlk8z"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.278066 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.278857 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-xdbd4"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.279722 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.282914 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.286103 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.287585 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.287777 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.288443 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.289647 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.289874 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.290280 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.290845 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.291178 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.291970 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.292432 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72qz9"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.293165 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.293464 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.294086 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.294606 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.295312 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.295932 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.296494 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.297130 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pjb9r"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.297588 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.298328 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.298847 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.300260 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.301368 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9vt68"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.301525 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.301877 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.302070 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.304626 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.305836 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mkwg6"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.305866 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5cl72"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.305968 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.306226 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.307406 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.308877 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7vcrd"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.309824 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vdwzh"] Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.313454 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321327 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9n45\" (UniqueName: \"kubernetes.io/projected/c464a7a2-0836-47e1-818d-aa8aaed4e657-kube-api-access-n9n45\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321394 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc 
kubenswrapper[4713]: I0127 15:49:31.321425 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321469 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-serving-cert\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321516 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/77bed8c1-f317-4117-8122-2bae18d42813-srv-cert\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321540 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321560 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-trusted-ca-bundle\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321581 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c464a7a2-0836-47e1-818d-aa8aaed4e657-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321601 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321619 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321640 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e114b26c-9613-4582-bfa4-d2ae0bc2e388-metrics-tls\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321660 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4cm2\" (UniqueName: \"kubernetes.io/projected/175f6c4b-7354-47b4-b9d1-5867e001bb88-kube-api-access-d4cm2\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321684 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd6mx\" (UniqueName: \"kubernetes.io/projected/7d587445-25fe-458a-8157-47d510ef09d4-kube-api-access-zd6mx\") pod \"multus-admission-controller-857f4d67dd-jqnj8\" (UID: \"7d587445-25fe-458a-8157-47d510ef09d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321719 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6r4t\" (UniqueName: \"kubernetes.io/projected/aae489c7-6a71-4465-9c25-7d27eb68b318-kube-api-access-k6r4t\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321741 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69jd2\" (UniqueName: \"kubernetes.io/projected/00b31185-ee2b-4482-99c1-f126457bd724-kube-api-access-69jd2\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 
15:49:31.321762 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-console-config\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321786 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321808 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175f6c4b-7354-47b4-b9d1-5867e001bb88-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321834 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjn9c\" (UniqueName: \"kubernetes.io/projected/ef05c38c-a0a0-40b6-9f4c-351768f9b547-kube-api-access-jjn9c\") pod \"cluster-samples-operator-665b6dd947-zppll\" (UID: \"ef05c38c-a0a0-40b6-9f4c-351768f9b547\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321909 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-policies\") pod 
\"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321936 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae489c7-6a71-4465-9c25-7d27eb68b318-serving-cert\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.321978 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxq4s\" (UniqueName: \"kubernetes.io/projected/86cad552-e907-4741-9709-e3952fcf470a-kube-api-access-xxq4s\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322002 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-metrics-certs\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322030 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4c64795e-faf7-45f0-8256-2ae8191e4e07-machine-approver-tls\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322069 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cjp74\" (UniqueName: \"kubernetes.io/projected/4c64795e-faf7-45f0-8256-2ae8191e4e07-kube-api-access-cjp74\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322096 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e411ddc4-0c8a-4cae-b08d-264bae41dffc-images\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322121 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e411ddc4-0c8a-4cae-b08d-264bae41dffc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322146 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-dir\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322173 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322203 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b76ed054-9d3c-4679-ac70-d226ec5ec51a-serving-cert\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e114b26c-9613-4582-bfa4-d2ae0bc2e388-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322251 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c464a7a2-0836-47e1-818d-aa8aaed4e657-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322277 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rc4t\" (UniqueName: \"kubernetes.io/projected/4b935bba-4a22-4c28-b629-367f4a43bbc0-kube-api-access-8rc4t\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322303 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-c69cw\" (UniqueName: \"kubernetes.io/projected/b76ed054-9d3c-4679-ac70-d226ec5ec51a-kube-api-access-c69cw\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322346 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322371 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grm2d\" (UniqueName: \"kubernetes.io/projected/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-kube-api-access-grm2d\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322393 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b77a577-edba-4a31-9885-ae11f403595a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322428 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a38fa71e-fb6f-4742-9291-0038d946cfec-console-oauth-config\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " 
pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322453 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322475 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-default-certificate\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322499 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49a98c2a-bf95-449c-8fee-d2b7ed4381db-service-ca-bundle\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322524 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e114b26c-9613-4582-bfa4-d2ae0bc2e388-trusted-ca\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322547 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-stats-auth\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt9m5\" (UniqueName: \"kubernetes.io/projected/e411ddc4-0c8a-4cae-b08d-264bae41dffc-kube-api-access-kt9m5\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322615 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322643 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-webhook-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322667 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-service-ca\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322692 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzrdr\" (UniqueName: \"kubernetes.io/projected/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-kube-api-access-fzrdr\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322717 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15eba547-a66e-48be-9d4e-fb4262515042-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322761 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322788 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw7qs\" (UniqueName: \"kubernetes.io/projected/e114b26c-9613-4582-bfa4-d2ae0bc2e388-kube-api-access-sw7qs\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322817 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-client-ca\") pod 
\"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322845 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322870 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322895 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-trusted-ca\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322918 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b77a577-edba-4a31-9885-ae11f403595a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 
15:49:31.322947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-config\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322971 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.322996 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323020 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-apiservice-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323066 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175f6c4b-7354-47b4-b9d1-5867e001bb88-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" 
(UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323095 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323122 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef05c38c-a0a0-40b6-9f4c-351768f9b547-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zppll\" (UID: \"ef05c38c-a0a0-40b6-9f4c-351768f9b547\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323146 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-config\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323196 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-config\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323220 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klw29\" (UniqueName: \"kubernetes.io/projected/7c0aa49a-697a-414c-94e5-fb0594c86d8b-kube-api-access-klw29\") pod \"dns-operator-744455d44c-mg8jq\" (UID: \"7c0aa49a-697a-414c-94e5-fb0594c86d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323248 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00b31185-ee2b-4482-99c1-f126457bd724-tmpfs\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtzfz\" (UniqueName: \"kubernetes.io/projected/ad390cc5-3b01-4343-97a2-2c4385fe5142-kube-api-access-rtzfz\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323300 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b77a577-edba-4a31-9885-ae11f403595a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323325 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tx6jv\" (UniqueName: \"kubernetes.io/projected/15eba547-a66e-48be-9d4e-fb4262515042-kube-api-access-tx6jv\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323350 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbndq\" (UniqueName: \"kubernetes.io/projected/a38fa71e-fb6f-4742-9291-0038d946cfec-kube-api-access-rbndq\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323376 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-client-ca\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323400 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323424 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc52l\" (UniqueName: \"kubernetes.io/projected/5ed86dba-52b7-4652-91c7-aea3c8def1fc-kube-api-access-xc52l\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323447 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86cad552-e907-4741-9709-e3952fcf470a-secret-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323472 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c64795e-faf7-45f0-8256-2ae8191e4e07-config\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323497 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323522 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad390cc5-3b01-4343-97a2-2c4385fe5142-serving-cert\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323558 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c464a7a2-0836-47e1-818d-aa8aaed4e657-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e411ddc4-0c8a-4cae-b08d-264bae41dffc-config\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323603 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323628 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwtzh\" (UniqueName: \"kubernetes.io/projected/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-kube-api-access-vwtzh\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323651 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15eba547-a66e-48be-9d4e-fb4262515042-proxy-tls\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323675 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rszx6\" (UniqueName: \"kubernetes.io/projected/49a98c2a-bf95-449c-8fee-d2b7ed4381db-kube-api-access-rszx6\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323698 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a38fa71e-fb6f-4742-9291-0038d946cfec-console-serving-cert\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323726 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b76ed054-9d3c-4679-ac70-d226ec5ec51a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323749 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c0aa49a-697a-414c-94e5-fb0594c86d8b-metrics-tls\") pod \"dns-operator-744455d44c-mg8jq\" (UID: \"7c0aa49a-697a-414c-94e5-fb0594c86d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323782 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-628z8\" (UniqueName: \"kubernetes.io/projected/01f5e6a0-654f-41cb-a694-2c86ca8523d8-kube-api-access-628z8\") pod \"downloads-7954f5f757-mkwg6\" (UID: \"01f5e6a0-654f-41cb-a694-2c86ca8523d8\") " pod="openshift-console/downloads-7954f5f757-mkwg6"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.323808 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/77bed8c1-f317-4117-8122-2bae18d42813-profile-collector-cert\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324485 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmgcz\" (UniqueName: \"kubernetes.io/projected/77bed8c1-f317-4117-8122-2bae18d42813-kube-api-access-rmgcz\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324516 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d587445-25fe-458a-8157-47d510ef09d4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jqnj8\" (UID: \"7d587445-25fe-458a-8157-47d510ef09d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c64795e-faf7-45f0-8256-2ae8191e4e07-auth-proxy-config\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324573 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b935bba-4a22-4c28-b629-367f4a43bbc0-serving-cert\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324599 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qzc9\" (UniqueName: \"kubernetes.io/projected/b27ad314-02e7-4170-8921-a531184c006d-kube-api-access-7qzc9\") pod \"migrator-59844c95c7-tqc9f\" (UID: \"b27ad314-02e7-4170-8921-a531184c006d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-config\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.324652 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-oauth-serving-cert\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.327187 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.327314 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.328190 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.329396 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.330165 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-service-ca-bundle\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.331464 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.331580 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-client-ca\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.332959 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/4c64795e-faf7-45f0-8256-2ae8191e4e07-auth-proxy-config\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.333923 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e411ddc4-0c8a-4cae-b08d-264bae41dffc-images\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.334441 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c64795e-faf7-45f0-8256-2ae8191e4e07-config\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.334714 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-dir\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.336009 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-config\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.336345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.336852 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-config\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.337152 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-policies\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.337912 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e411ddc4-0c8a-4cae-b08d-264bae41dffc-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.338313 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-config\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.339078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e411ddc4-0c8a-4cae-b08d-264bae41dffc-config\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.339024 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c464a7a2-0836-47e1-818d-aa8aaed4e657-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.339641 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad390cc5-3b01-4343-97a2-2c4385fe5142-serving-cert\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.340461 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.341186 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae489c7-6a71-4465-9c25-7d27eb68b318-serving-cert\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.341235 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-client-ca\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.342137 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-config\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.342473 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4b935bba-4a22-4c28-b629-367f4a43bbc0-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.342511 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/4c64795e-faf7-45f0-8256-2ae8191e4e07-machine-approver-tls\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.342496 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.342911 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4b935bba-4a22-4c28-b629-367f4a43bbc0-serving-cert\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.343565 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tfz52"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.343595 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/c464a7a2-0836-47e1-818d-aa8aaed4e657-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.345910 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t92fx"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.345935 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/ef05c38c-a0a0-40b6-9f4c-351768f9b547-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zppll\" (UID: \"ef05c38c-a0a0-40b6-9f4c-351768f9b547\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.346292 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.347229 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.348680 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.348770 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xffcl"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.349385 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.351581 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.351837 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.352017 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.352425 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.365395 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.365516 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.367380 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.370152 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mg8jq"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.371593 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.373374 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.375565 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.377259 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.379617 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72qz9"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.383431 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlk8z"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.384895 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.385428 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.387547 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.389474 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-f7mlx"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.390417 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f7mlx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.391256 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ngmbx"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.392227 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ngmbx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.392881 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.394462 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.396749 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jjr22"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.398806 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vdwzh"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.400411 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.401565 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.402670 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.403712 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.404338 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.404917 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.406104 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.407401 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.408443 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.409518 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pjb9r"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.410524 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.412794 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.414178 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9vt68"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.415730 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-vmvmj"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.416919 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-vmvmj"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.417327 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ngmbx"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.419148 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vmvmj"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.424108 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.424671 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-5cl72"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426323 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e114b26c-9613-4582-bfa4-d2ae0bc2e388-metrics-tls\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426380 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4cm2\" (UniqueName: \"kubernetes.io/projected/175f6c4b-7354-47b4-b9d1-5867e001bb88-kube-api-access-d4cm2\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426425 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zd6mx\" (UniqueName: \"kubernetes.io/projected/7d587445-25fe-458a-8157-47d510ef09d4-kube-api-access-zd6mx\") pod \"multus-admission-controller-857f4d67dd-jqnj8\" (UID: \"7d587445-25fe-458a-8157-47d510ef09d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426538 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jd2\" (UniqueName: \"kubernetes.io/projected/00b31185-ee2b-4482-99c1-f126457bd724-kube-api-access-69jd2\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426565 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume
\"console-config\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-console-config\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426601 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175f6c4b-7354-47b4-b9d1-5867e001bb88-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426628 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxq4s\" (UniqueName: \"kubernetes.io/projected/86cad552-e907-4741-9709-e3952fcf470a-kube-api-access-xxq4s\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426648 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-metrics-certs\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426672 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b76ed054-9d3c-4679-ac70-d226ec5ec51a-serving-cert\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc 
kubenswrapper[4713]: I0127 15:49:31.426690 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e114b26c-9613-4582-bfa4-d2ae0bc2e388-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426713 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c69cw\" (UniqueName: \"kubernetes.io/projected/b76ed054-9d3c-4679-ac70-d226ec5ec51a-kube-api-access-c69cw\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426748 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b77a577-edba-4a31-9885-ae11f403595a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426768 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-default-certificate\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426787 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49a98c2a-bf95-449c-8fee-d2b7ed4381db-service-ca-bundle\") pod \"router-default-5444994796-xdbd4\" 
(UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426803 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a38fa71e-fb6f-4742-9291-0038d946cfec-console-oauth-config\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426819 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e114b26c-9613-4582-bfa4-d2ae0bc2e388-trusted-ca\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-stats-auth\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426879 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-webhook-cert\") pod 
\"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426894 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-service-ca\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426913 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzrdr\" (UniqueName: \"kubernetes.io/projected/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-kube-api-access-fzrdr\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15eba547-a66e-48be-9d4e-fb4262515042-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sw7qs\" (UniqueName: \"kubernetes.io/projected/e114b26c-9613-4582-bfa4-d2ae0bc2e388-kube-api-access-sw7qs\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-trusted-ca\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.426994 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b77a577-edba-4a31-9885-ae11f403595a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427022 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427060 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-apiservice-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427078 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175f6c4b-7354-47b4-b9d1-5867e001bb88-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427094 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-config\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427115 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klw29\" (UniqueName: \"kubernetes.io/projected/7c0aa49a-697a-414c-94e5-fb0594c86d8b-kube-api-access-klw29\") pod \"dns-operator-744455d44c-mg8jq\" (UID: \"7c0aa49a-697a-414c-94e5-fb0594c86d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427133 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00b31185-ee2b-4482-99c1-f126457bd724-tmpfs\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427160 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b77a577-edba-4a31-9885-ae11f403595a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427175 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tx6jv\" (UniqueName: 
\"kubernetes.io/projected/15eba547-a66e-48be-9d4e-fb4262515042-kube-api-access-tx6jv\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427196 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbndq\" (UniqueName: \"kubernetes.io/projected/a38fa71e-fb6f-4742-9291-0038d946cfec-kube-api-access-rbndq\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427243 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427259 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xc52l\" (UniqueName: \"kubernetes.io/projected/5ed86dba-52b7-4652-91c7-aea3c8def1fc-kube-api-access-xc52l\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427274 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86cad552-e907-4741-9709-e3952fcf470a-secret-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:31 crc kubenswrapper[4713]: 
I0127 15:49:31.427316 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15eba547-a66e-48be-9d4e-fb4262515042-proxy-tls\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rszx6\" (UniqueName: \"kubernetes.io/projected/49a98c2a-bf95-449c-8fee-d2b7ed4381db-kube-api-access-rszx6\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427353 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a38fa71e-fb6f-4742-9291-0038d946cfec-console-serving-cert\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427373 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/b76ed054-9d3c-4679-ac70-d226ec5ec51a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427389 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c0aa49a-697a-414c-94e5-fb0594c86d8b-metrics-tls\") pod \"dns-operator-744455d44c-mg8jq\" (UID: \"7c0aa49a-697a-414c-94e5-fb0594c86d8b\") " 
pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427411 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/77bed8c1-f317-4117-8122-2bae18d42813-profile-collector-cert\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427429 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmgcz\" (UniqueName: \"kubernetes.io/projected/77bed8c1-f317-4117-8122-2bae18d42813-kube-api-access-rmgcz\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427447 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d587445-25fe-458a-8157-47d510ef09d4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jqnj8\" (UID: \"7d587445-25fe-458a-8157-47d510ef09d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427465 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qzc9\" (UniqueName: \"kubernetes.io/projected/b27ad314-02e7-4170-8921-a531184c006d-kube-api-access-7qzc9\") pod \"migrator-59844c95c7-tqc9f\" (UID: \"b27ad314-02e7-4170-8921-a531184c006d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427482 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-oauth-serving-cert\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427510 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-serving-cert\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427524 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/77bed8c1-f317-4117-8122-2bae18d42813-srv-cert\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-trusted-ca-bundle\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.427771 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-console-config\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.428662 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-trusted-ca-bundle\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.428734 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e114b26c-9613-4582-bfa4-d2ae0bc2e388-trusted-ca\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.431748 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e114b26c-9613-4582-bfa4-d2ae0bc2e388-metrics-tls\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.432024 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/15eba547-a66e-48be-9d4e-fb4262515042-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.432796 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/00b31185-ee2b-4482-99c1-f126457bd724-tmpfs\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.432868 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5b77a577-edba-4a31-9885-ae11f403595a-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.433623 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-config\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.434052 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-trusted-ca\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.434105 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-service-ca\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.434173 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/a38fa71e-fb6f-4742-9291-0038d946cfec-oauth-serving-cert\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.434217 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" 
(UniqueName: \"kubernetes.io/empty-dir/b76ed054-9d3c-4679-ac70-d226ec5ec51a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.435205 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c0aa49a-697a-414c-94e5-fb0594c86d8b-metrics-tls\") pod \"dns-operator-744455d44c-mg8jq\" (UID: \"7c0aa49a-697a-414c-94e5-fb0594c86d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.435324 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/a38fa71e-fb6f-4742-9291-0038d946cfec-console-serving-cert\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.436232 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-serving-cert\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.436263 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b77a577-edba-4a31-9885-ae11f403595a-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.439287 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/a38fa71e-fb6f-4742-9291-0038d946cfec-console-oauth-config\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.444467 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.464070 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.483950 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.504155 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.526169 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.545611 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.565764 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.583883 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.604356 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.617401 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-5cl72"]
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.624556 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.643666 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.657382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/15eba547-a66e-48be-9d4e-fb4262515042-proxy-tls\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.664262 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.684253 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.704393 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.724274 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.745360 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.764850 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.784735 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.804711 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.816237 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.829642 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.832242 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.844483 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.864837 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.884880 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.893499 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-stats-auth\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.904424 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.912927 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-metrics-certs\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.924592 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.945981 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.954613 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/49a98c2a-bf95-449c-8fee-d2b7ed4381db-default-certificate\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.964063 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127
15:49:31.969890 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49a98c2a-bf95-449c-8fee-d2b7ed4381db-service-ca-bundle\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4"
Jan 27 15:49:31 crc kubenswrapper[4713]: I0127 15:49:31.984377 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.004960 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.024542 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.044794 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.064267 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.084016 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.096946 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/7d587445-25fe-458a-8157-47d510ef09d4-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-jqnj8\" (UID: \"7d587445-25fe-458a-8157-47d510ef09d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.104457 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.124173 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.147762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.156232 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/77bed8c1-f317-4117-8122-2bae18d42813-srv-cert\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.164300 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.175801 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86cad552-e907-4741-9709-e3952fcf470a-secret-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.175953 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/77bed8c1-f317-4117-8122-2bae18d42813-profile-collector-cert\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.184644 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.204414 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.224094 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.244031 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.264848 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.284417 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.292959 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/175f6c4b-7354-47b4-b9d1-5867e001bb88-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.302456 4713 request.go:700] Waited for 1.010009356s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-storage-version-migrator-operator/configmaps?fieldSelector=metadata.name%3Dconfig&limit=500&resourceVersion=0
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.304554 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.312558 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/175f6c4b-7354-47b4-b9d1-5867e001bb88-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.324661 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.344175 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.364466 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.385293 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.394279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b76ed054-9d3c-4679-ac70-d226ec5ec51a-serving-cert\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.405014 4713 reflector.go:368] Caches populated for *v1.Secret from
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.424823 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: E0127 15:49:32.428947 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Jan 27 15:49:32 crc kubenswrapper[4713]: E0127 15:49:32.429433 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-webhook-cert podName:00b31185-ee2b-4482-99c1-f126457bd724 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:32.929397259 +0000 UTC m=+140.707607197 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-webhook-cert") pod "packageserver-d55dfcdfc-qbcpm" (UID: "00b31185-ee2b-4482-99c1-f126457bd724") : failed to sync secret cache: timed out waiting for the condition
Jan 27 15:49:32 crc kubenswrapper[4713]: E0127 15:49:32.433353 4713 secret.go:188] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Jan 27 15:49:32 crc kubenswrapper[4713]: E0127 15:49:32.433484 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-apiservice-cert podName:00b31185-ee2b-4482-99c1-f126457bd724 nodeName:}" failed. No retries permitted until 2026-01-27 15:49:32.933452546 +0000 UTC m=+140.711662484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-apiservice-cert") pod "packageserver-d55dfcdfc-qbcpm" (UID: "00b31185-ee2b-4482-99c1-f126457bd724") : failed to sync secret cache: timed out waiting for the condition
Jan 27 15:49:32 crc kubenswrapper[4713]: E0127 15:49:32.433488 4713 configmap.go:193] Couldn't get configMap openshift-operator-lifecycle-manager/collect-profiles-config: failed to sync configmap cache: timed out waiting for the condition
Jan 27 15:49:32 crc kubenswrapper[4713]: E0127 15:49:32.433646 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume podName:86cad552-e907-4741-9709-e3952fcf470a nodeName:}" failed. No retries permitted until 2026-01-27 15:49:32.93361736 +0000 UTC m=+140.711827298 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume") pod "collect-profiles-29492145-r8sxx" (UID: "86cad552-e907-4741-9709-e3952fcf470a") : failed to sync configmap cache: timed out waiting for the condition
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.443968 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.465516 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.484347 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.504829 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.524271 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.543778 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.565031 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.584874 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.605306 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.607955 4713 generic.go:334] "Generic (PLEG): container finished" podID="f0905e52-1e55-4a6b-887a-6680bd4d2004" containerID="07fc5b2501a3d0d18b0247c30c9a24e83ffeb6e3a0efcf776bdae3287fe6ca2f" exitCode=0
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.608010 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" event={"ID":"f0905e52-1e55-4a6b-887a-6680bd4d2004","Type":"ContainerDied","Data":"07fc5b2501a3d0d18b0247c30c9a24e83ffeb6e3a0efcf776bdae3287fe6ca2f"}
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.608070 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" event={"ID":"f0905e52-1e55-4a6b-887a-6680bd4d2004","Type":"ContainerStarted","Data":"51184a078dce1f8588e6cfaad3cf4915b2a9c4340dce12433154e2f487bdbd37"}
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.623985 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.644005 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.665400 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.684729 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.705154 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.724963 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.747394 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.764793 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.785665 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.804648 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.826441 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.844558 4713 reflector.go:368] Caches populated for
*v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.863888 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.883517 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.906568 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.925128 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.950610 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.950674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-apiservice-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.950981 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-webhook-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.951946 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.955166 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-webhook-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.955380 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/00b31185-ee2b-4482-99c1-f126457bd724-apiservice-cert\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.964666 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Jan 27 15:49:32 crc kubenswrapper[4713]: I0127 15:49:32.983612 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.004320 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.041555 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9n45\" (UniqueName: \"kubernetes.io/projected/c464a7a2-0836-47e1-818d-aa8aaed4e657-kube-api-access-n9n45\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.061231 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjp74\" (UniqueName: \"kubernetes.io/projected/4c64795e-faf7-45f0-8256-2ae8191e4e07-kube-api-access-cjp74\") pod \"machine-approver-56656f9798-gcqw5\" (UID: \"4c64795e-faf7-45f0-8256-2ae8191e4e07\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.080346 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-628z8\" (UniqueName: \"kubernetes.io/projected/01f5e6a0-654f-41cb-a694-2c86ca8523d8-kube-api-access-628z8\") pod \"downloads-7954f5f757-mkwg6\" (UID: \"01f5e6a0-654f-41cb-a694-2c86ca8523d8\") " pod="openshift-console/downloads-7954f5f757-mkwg6"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.085710 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.101246 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6r4t\" (UniqueName: \"kubernetes.io/projected/aae489c7-6a71-4465-9c25-7d27eb68b318-kube-api-access-k6r4t\") pod \"route-controller-manager-6576b87f9c-h5jwm\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:33 crc kubenswrapper[4713]: W0127 15:49:33.103854 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c64795e_faf7_45f0_8256_2ae8191e4e07.slice/crio-cf4f63c51b7b85345e4130af28c511dea30915eef4b4a7c7464cbfa3b271d8bf WatchSource:0}: Error finding container cf4f63c51b7b85345e4130af28c511dea30915eef4b4a7c7464cbfa3b271d8bf: Status 404 returned error can't find the container with id cf4f63c51b7b85345e4130af28c511dea30915eef4b4a7c7464cbfa3b271d8bf
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.119673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwtzh\" (UniqueName: \"kubernetes.io/projected/520dd40a-8e43-4dc0-abf7-2c2b4873b1ab-kube-api-access-vwtzh\") pod \"openshift-apiserver-operator-796bbdcf4f-5rbsl\" (UID: \"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.141158 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c464a7a2-0836-47e1-818d-aa8aaed4e657-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-h5w6b\" (UID: \"c464a7a2-0836-47e1-818d-aa8aaed4e657\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"
Jan 27 15:49:33 crc
kubenswrapper[4713]: I0127 15:49:33.159640 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rc4t\" (UniqueName: \"kubernetes.io/projected/4b935bba-4a22-4c28-b629-367f4a43bbc0-kube-api-access-8rc4t\") pod \"authentication-operator-69f744f599-wrjvf\" (UID: \"4b935bba-4a22-4c28-b629-367f4a43bbc0\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.180911 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.184130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grm2d\" (UniqueName: \"kubernetes.io/projected/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-kube-api-access-grm2d\") pod \"oauth-openshift-558db77b4-t92fx\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.201393 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt9m5\" (UniqueName: \"kubernetes.io/projected/e411ddc4-0c8a-4cae-b08d-264bae41dffc-kube-api-access-kt9m5\") pod \"machine-api-operator-5694c8668f-4sxh8\" (UID: \"e411ddc4-0c8a-4cae-b08d-264bae41dffc\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.221770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjn9c\" (UniqueName: \"kubernetes.io/projected/ef05c38c-a0a0-40b6-9f4c-351768f9b547-kube-api-access-jjn9c\") pod \"cluster-samples-operator-665b6dd947-zppll\" (UID: \"ef05c38c-a0a0-40b6-9f4c-351768f9b547\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.238219 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.245124 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtzfz\" (UniqueName: \"kubernetes.io/projected/ad390cc5-3b01-4343-97a2-2c4385fe5142-kube-api-access-rtzfz\") pod \"controller-manager-879f6c89f-7vcrd\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") " pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.257927 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-mkwg6"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.264999 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.284550 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.302645 4713 request.go:700] Waited for 1.911892588s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-tls&limit=500&resourceVersion=0
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.305101 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.325910 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.325919 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.326395 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.344196 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.365651 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.384496 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.390375 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl"]
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.399958 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.405379 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.411324 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.421971 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.427469 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.444830 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.484756 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.488247 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd6mx\" (UniqueName: \"kubernetes.io/projected/7d587445-25fe-458a-8157-47d510ef09d4-kube-api-access-zd6mx\") pod \"multus-admission-controller-857f4d67dd-jqnj8\" (UID: \"7d587445-25fe-458a-8157-47d510ef09d4\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.519170 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4cm2\" (UniqueName: \"kubernetes.io/projected/175f6c4b-7354-47b4-b9d1-5867e001bb88-kube-api-access-d4cm2\") pod \"kube-storage-version-migrator-operator-b67b599dd-v744w\" (UID: \"175f6c4b-7354-47b4-b9d1-5867e001bb88\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.528694 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c69cw\" (UniqueName: \"kubernetes.io/projected/b76ed054-9d3c-4679-ac70-d226ec5ec51a-kube-api-access-c69cw\") pod \"openshift-config-operator-7777fb866f-72qz9\" (UID: \"b76ed054-9d3c-4679-ac70-d226ec5ec51a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.533673 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"]
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.544686 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxq4s\" (UniqueName: \"kubernetes.io/projected/86cad552-e907-4741-9709-e3952fcf470a-kube-api-access-xxq4s\") pod \"collect-profiles-29492145-r8sxx\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.565812 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e114b26c-9613-4582-bfa4-d2ae0bc2e388-bound-sa-token\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.583695 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzrdr\" (UniqueName: \"kubernetes.io/projected/f57fea35-d5a7-43ca-843f-c8b81ca8bf69-kube-api-access-fzrdr\") pod \"console-operator-58897d9998-tfz52\" (UID: \"f57fea35-d5a7-43ca-843f-c8b81ca8bf69\") " pod="openshift-console-operator/console-operator-58897d9998-tfz52"
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.603234 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc52l\" (UniqueName: \"kubernetes.io/projected/5ed86dba-52b7-4652-91c7-aea3c8def1fc-kube-api-access-xc52l\") pod \"marketplace-operator-79b997595-hlk8z\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z"
Jan 27 15:49:33 crc kubenswrapper[4713]:
I0127 15:49:33.630142 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw7qs\" (UniqueName: \"kubernetes.io/projected/e114b26c-9613-4582-bfa4-d2ae0bc2e388-kube-api-access-sw7qs\") pod \"ingress-operator-5b745b69d9-kqg95\" (UID: \"e114b26c-9613-4582-bfa4-d2ae0bc2e388\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.634437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" event={"ID":"4c64795e-faf7-45f0-8256-2ae8191e4e07","Type":"ContainerStarted","Data":"96091cd70fc9641499bdbe4b38f59fe94d6f61d7c28c8e04cea17e22803b758e"} Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.634504 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" event={"ID":"4c64795e-faf7-45f0-8256-2ae8191e4e07","Type":"ContainerStarted","Data":"cf4f63c51b7b85345e4130af28c511dea30915eef4b4a7c7464cbfa3b271d8bf"} Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.634705 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.644722 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.658112 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" event={"ID":"f0905e52-1e55-4a6b-887a-6680bd4d2004","Type":"ContainerStarted","Data":"db24e4cd561de59171f6ccd8f14627ef6909df385f79643583f466299888ba1f"} Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.658192 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" event={"ID":"f0905e52-1e55-4a6b-887a-6680bd4d2004","Type":"ContainerStarted","Data":"7dc510e01a423cec1a03ad0a67885f2cdb7997472a8b768e9305fb9ebd8d63db"} Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.664842 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rszx6\" (UniqueName: \"kubernetes.io/projected/49a98c2a-bf95-449c-8fee-d2b7ed4381db-kube-api-access-rszx6\") pod \"router-default-5444994796-xdbd4\" (UID: \"49a98c2a-bf95-449c-8fee-d2b7ed4381db\") " pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.671342 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" event={"ID":"aae489c7-6a71-4465-9c25-7d27eb68b318","Type":"ContainerStarted","Data":"804ba3da8821a4db1b9097d1e82fde3602625b707cb58dfdc7606edfd52bd3ad"} Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.673886 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qzc9\" (UniqueName: \"kubernetes.io/projected/b27ad314-02e7-4170-8921-a531184c006d-kube-api-access-7qzc9\") pod \"migrator-59844c95c7-tqc9f\" (UID: \"b27ad314-02e7-4170-8921-a531184c006d\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.676704 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" event={"ID":"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab","Type":"ContainerStarted","Data":"4f98c96f095729a3fc808105162296968c13566b4d232c4fde73c74ac61bafef"} Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.689626 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.694738 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.698858 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmgcz\" (UniqueName: \"kubernetes.io/projected/77bed8c1-f317-4117-8122-2bae18d42813-kube-api-access-rmgcz\") pod \"catalog-operator-68c6474976-vkk8k\" (UID: \"77bed8c1-f317-4117-8122-2bae18d42813\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.706500 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jd2\" (UniqueName: \"kubernetes.io/projected/00b31185-ee2b-4482-99c1-f126457bd724-kube-api-access-69jd2\") pod \"packageserver-d55dfcdfc-qbcpm\" (UID: \"00b31185-ee2b-4482-99c1-f126457bd724\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.733463 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.735069 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tx6jv\" (UniqueName: \"kubernetes.io/projected/15eba547-a66e-48be-9d4e-fb4262515042-kube-api-access-tx6jv\") pod \"machine-config-controller-84d6567774-tgdd8\" (UID: \"15eba547-a66e-48be-9d4e-fb4262515042\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.753303 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbndq\" (UniqueName: \"kubernetes.io/projected/a38fa71e-fb6f-4742-9291-0038d946cfec-kube-api-access-rbndq\") pod \"console-f9d7485db-xffcl\" (UID: \"a38fa71e-fb6f-4742-9291-0038d946cfec\") " pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.759145 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.765258 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b77a577-edba-4a31-9885-ae11f403595a-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-8jtx5\" (UID: \"5b77a577-edba-4a31-9885-ae11f403595a\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.777740 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.782445 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klw29\" (UniqueName: \"kubernetes.io/projected/7c0aa49a-697a-414c-94e5-fb0594c86d8b-kube-api-access-klw29\") pod \"dns-operator-744455d44c-mg8jq\" (UID: \"7c0aa49a-697a-414c-94e5-fb0594c86d8b\") " pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.792815 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.803149 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.822831 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.834492 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.858552 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.876817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce473d77-3607-455d-b44d-f70055fb5178-config\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.876872 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-tls\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.876899 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff13383-b1fe-4905-81f2-6921bcb452fb-config\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.876955 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-images\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.876984 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfe545e4-37ec-41d0-aad6-d8a67025503d-srv-cert\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877030 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-client\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877086 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-service-ca\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877150 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/474cf50a-fd9d-48f5-8a91-69a32f459043-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zfnxn\" (UID: \"474cf50a-fd9d-48f5-8a91-69a32f459043\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877191 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8k27x\" (UniqueName: \"kubernetes.io/projected/c6936ff4-89a3-4627-aa92-34a950e2ee0f-kube-api-access-8k27x\") pod \"control-plane-machine-set-operator-78cbb6b69f-rznpq\" (UID: 
\"c6936ff4-89a3-4627-aa92-34a950e2ee0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877219 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrjsr\" (UniqueName: \"kubernetes.io/projected/19b94a05-aa83-4939-828b-8715ccc15dd7-kube-api-access-zrjsr\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877255 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff13383-b1fe-4905-81f2-6921bcb452fb-serving-cert\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877278 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce473d77-3607-455d-b44d-f70055fb5178-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877307 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-ca\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877350 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbct5\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-kube-api-access-hbct5\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877395 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-certificates\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877430 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877472 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gv9m\" (UniqueName: \"kubernetes.io/projected/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-kube-api-access-4gv9m\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtl5c\" (UniqueName: \"kubernetes.io/projected/bfe545e4-37ec-41d0-aad6-d8a67025503d-kube-api-access-wtl5c\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: 
\"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877519 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-bound-sa-token\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877552 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4994bfc4-7e92-4877-a981-6d94b4df000a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-etcd-client\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877625 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6qb6\" (UniqueName: \"kubernetes.io/projected/5ff13383-b1fe-4905-81f2-6921bcb452fb-kube-api-access-j6qb6\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877648 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-config\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877679 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877709 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877732 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-encryption-config\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877755 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: 
\"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877776 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbh7j\" (UniqueName: \"kubernetes.io/projected/01fcd844-681c-415a-ad44-deaecdf5dc67-kube-api-access-cbh7j\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877844 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877884 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4994bfc4-7e92-4877-a981-6d94b4df000a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877920 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6936ff4-89a3-4627-aa92-34a950e2ee0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rznpq\" (UID: \"c6936ff4-89a3-4627-aa92-34a950e2ee0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:33 
crc kubenswrapper[4713]: I0127 15:49:33.877960 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-audit-dir\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.877986 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xq6nw\" (UniqueName: \"kubernetes.io/projected/474cf50a-fd9d-48f5-8a91-69a32f459043-kube-api-access-xq6nw\") pod \"package-server-manager-789f6589d5-zfnxn\" (UID: \"474cf50a-fd9d-48f5-8a91-69a32f459043\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878056 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01fcd844-681c-415a-ad44-deaecdf5dc67-serving-cert\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878096 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fmpk\" (UniqueName: \"kubernetes.io/projected/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-kube-api-access-6fmpk\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878121 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/19b94a05-aa83-4939-828b-8715ccc15dd7-signing-cabundle\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878142 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-audit-policies\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lg4z\" (UniqueName: \"kubernetes.io/projected/3e878f77-b89c-469a-8c14-0df126919fb6-kube-api-access-6lg4z\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878288 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-serving-cert\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878327 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e878f77-b89c-469a-8c14-0df126919fb6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878352 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878399 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-proxy-tls\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878423 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/19b94a05-aa83-4939-828b-8715ccc15dd7-signing-key\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878447 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfe545e4-37ec-41d0-aad6-d8a67025503d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878472 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-trusted-ca\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878496 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce473d77-3607-455d-b44d-f70055fb5178-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878518 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-config\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.878541 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e878f77-b89c-469a-8c14-0df126919fb6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:33 crc kubenswrapper[4713]: E0127 15:49:33.883290 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:34.383273113 +0000 UTC m=+142.161483051 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.892172 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.893323 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-mkwg6"] Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.902142 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t92fx"] Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.954826 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.963781 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.972292 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-wrjvf"] Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.979905 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980176 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-serving-cert\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980217 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e878f77-b89c-469a-8c14-0df126919fb6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980260 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" 
Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980311 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-trusted-ca\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980334 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-proxy-tls\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980355 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/19b94a05-aa83-4939-828b-8715ccc15dd7-signing-key\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980379 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfe545e4-37ec-41d0-aad6-d8a67025503d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce473d77-3607-455d-b44d-f70055fb5178-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980454 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-config\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980476 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e878f77-b89c-469a-8c14-0df126919fb6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce473d77-3607-455d-b44d-f70055fb5178-config\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980553 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-tls\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980599 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff13383-b1fe-4905-81f2-6921bcb452fb-config\") 
pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980634 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-images\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980657 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfe545e4-37ec-41d0-aad6-d8a67025503d-srv-cert\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980720 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-client\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-service-ca\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980850 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/6ee31a47-41e4-4c20-9173-48a250b99c3c-metrics-tls\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980930 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/474cf50a-fd9d-48f5-8a91-69a32f459043-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zfnxn\" (UID: \"474cf50a-fd9d-48f5-8a91-69a32f459043\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.980957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8k27x\" (UniqueName: \"kubernetes.io/projected/c6936ff4-89a3-4627-aa92-34a950e2ee0f-kube-api-access-8k27x\") pod \"control-plane-machine-set-operator-78cbb6b69f-rznpq\" (UID: \"c6936ff4-89a3-4627-aa92-34a950e2ee0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981005 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrjsr\" (UniqueName: \"kubernetes.io/projected/19b94a05-aa83-4939-828b-8715ccc15dd7-kube-api-access-zrjsr\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981030 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-registration-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 
15:49:33.981085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff13383-b1fe-4905-81f2-6921bcb452fb-serving-cert\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981114 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x9h7v\" (UniqueName: \"kubernetes.io/projected/475c8723-6375-4149-8cec-2e5ea00943e3-kube-api-access-x9h7v\") pod \"ingress-canary-ngmbx\" (UID: \"475c8723-6375-4149-8cec-2e5ea00943e3\") " pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981156 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce473d77-3607-455d-b44d-f70055fb5178-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981216 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-ca\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981255 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbct5\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-kube-api-access-hbct5\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981285 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mq7s\" (UniqueName: \"kubernetes.io/projected/6ee31a47-41e4-4c20-9173-48a250b99c3c-kube-api-access-8mq7s\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981354 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-certificates\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981390 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981420 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gv9m\" (UniqueName: \"kubernetes.io/projected/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-kube-api-access-4gv9m\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.981445 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtl5c\" (UniqueName: \"kubernetes.io/projected/bfe545e4-37ec-41d0-aad6-d8a67025503d-kube-api-access-wtl5c\") pod 
\"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983103 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4994bfc4-7e92-4877-a981-6d94b4df000a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983142 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-bound-sa-token\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-etcd-client\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983314 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-socket-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983355 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6qb6\" (UniqueName: 
\"kubernetes.io/projected/5ff13383-b1fe-4905-81f2-6921bcb452fb-kube-api-access-j6qb6\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983382 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-config\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983516 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/475c8723-6375-4149-8cec-2e5ea00943e3-cert\") pod \"ingress-canary-ngmbx\" (UID: \"475c8723-6375-4149-8cec-2e5ea00943e3\") " pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983546 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-mountpoint-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983574 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 
15:49:33.983586 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-service-ca\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983598 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-encryption-config\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983720 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ee31a47-41e4-4c20-9173-48a250b99c3c-config-volume\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983785 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbh7j\" (UniqueName: \"kubernetes.io/projected/01fcd844-681c-415a-ad44-deaecdf5dc67-kube-api-access-cbh7j\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc 
kubenswrapper[4713]: I0127 15:49:33.983837 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/101d8d19-546f-4af5-87a2-0e0d86b7593a-node-bootstrap-token\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.983945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984052 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4994bfc4-7e92-4877-a981-6d94b4df000a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984084 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6936ff4-89a3-4627-aa92-34a950e2ee0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rznpq\" (UID: \"c6936ff4-89a3-4627-aa92-34a950e2ee0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984116 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-csi-data-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984156 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/101d8d19-546f-4af5-87a2-0e0d86b7593a-certs\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984238 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-audit-dir\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984282 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xq6nw\" (UniqueName: \"kubernetes.io/projected/474cf50a-fd9d-48f5-8a91-69a32f459043-kube-api-access-xq6nw\") pod \"package-server-manager-789f6589d5-zfnxn\" (UID: \"474cf50a-fd9d-48f5-8a91-69a32f459043\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984298 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-4sxh8"] Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984304 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-858fd\" (UniqueName: \"kubernetes.io/projected/101d8d19-546f-4af5-87a2-0e0d86b7593a-kube-api-access-858fd\") pod \"machine-config-server-f7mlx\" (UID: 
\"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984380 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-plugins-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984408 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcjk4\" (UniqueName: \"kubernetes.io/projected/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-kube-api-access-fcjk4\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984515 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01fcd844-681c-415a-ad44-deaecdf5dc67-serving-cert\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984593 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fmpk\" (UniqueName: \"kubernetes.io/projected/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-kube-api-access-6fmpk\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984632 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/19b94a05-aa83-4939-828b-8715ccc15dd7-signing-cabundle\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984705 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-audit-policies\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.984755 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lg4z\" (UniqueName: \"kubernetes.io/projected/3e878f77-b89c-469a-8c14-0df126919fb6-kube-api-access-6lg4z\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:33 crc kubenswrapper[4713]: E0127 15:49:33.991250 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:34.491207833 +0000 UTC m=+142.269417771 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.993358 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4994bfc4-7e92-4877-a981-6d94b4df000a-ca-trust-extracted\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.994252 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-config\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:33 crc kubenswrapper[4713]: I0127 15:49:33.994847 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.018174 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-config\") pod \"etcd-operator-b45778765-9vt68\" (UID: 
\"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.018279 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-auth-proxy-config\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.019008 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-certificates\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.019434 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-audit-dir\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.019466 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3e878f77-b89c-469a-8c14-0df126919fb6-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.020770 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.021422 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-images\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.021793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-ca\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.022190 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ce473d77-3607-455d-b44d-f70055fb5178-config\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.022204 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-trusted-ca\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.022785 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-audit-policies\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.025534 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5ff13383-b1fe-4905-81f2-6921bcb452fb-config\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.031416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/19b94a05-aa83-4939-828b-8715ccc15dd7-signing-cabundle\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.040077 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7vcrd"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.047174 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/474cf50a-fd9d-48f5-8a91-69a32f459043-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-zfnxn\" (UID: \"474cf50a-fd9d-48f5-8a91-69a32f459043\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.047797 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01fcd844-681c-415a-ad44-deaecdf5dc67-serving-cert\") pod \"etcd-operator-b45778765-9vt68\" (UID: 
\"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.056428 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.060746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-encryption-config\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.061079 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5ff13383-b1fe-4905-81f2-6921bcb452fb-serving-cert\") pod \"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.063846 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4994bfc4-7e92-4877-a981-6d94b4df000a-installation-pull-secrets\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.063846 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/c6936ff4-89a3-4627-aa92-34a950e2ee0f-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-rznpq\" (UID: \"c6936ff4-89a3-4627-aa92-34a950e2ee0f\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.064127 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/19b94a05-aa83-4939-828b-8715ccc15dd7-signing-key\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.064172 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-proxy-tls\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.064338 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3e878f77-b89c-469a-8c14-0df126919fb6-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.064510 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ce473d77-3607-455d-b44d-f70055fb5178-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.064849 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8k27x\" (UniqueName: 
\"kubernetes.io/projected/c6936ff4-89a3-4627-aa92-34a950e2ee0f-kube-api-access-8k27x\") pod \"control-plane-machine-set-operator-78cbb6b69f-rznpq\" (UID: \"c6936ff4-89a3-4627-aa92-34a950e2ee0f\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.066078 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-tls\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.066610 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bfe545e4-37ec-41d0-aad6-d8a67025503d-srv-cert\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.066849 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069578 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-serving-cert\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069695 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/01fcd844-681c-415a-ad44-deaecdf5dc67-etcd-client\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069743 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrjsr\" (UniqueName: \"kubernetes.io/projected/19b94a05-aa83-4939-828b-8715ccc15dd7-kube-api-access-zrjsr\") pod \"service-ca-9c57cc56f-pjb9r\" (UID: \"19b94a05-aa83-4939-828b-8715ccc15dd7\") " pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069788 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-etcd-client\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069920 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbh7j\" (UniqueName: \"kubernetes.io/projected/01fcd844-681c-415a-ad44-deaecdf5dc67-kube-api-access-cbh7j\") pod \"etcd-operator-b45778765-9vt68\" (UID: \"01fcd844-681c-415a-ad44-deaecdf5dc67\") " pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.069999 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6qb6\" (UniqueName: \"kubernetes.io/projected/5ff13383-b1fe-4905-81f2-6921bcb452fb-kube-api-access-j6qb6\") pod 
\"service-ca-operator-777779d784-wbnzn\" (UID: \"5ff13383-b1fe-4905-81f2-6921bcb452fb\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.071841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bfe545e4-37ec-41d0-aad6-d8a67025503d-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.074594 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-bound-sa-token\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.079192 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lg4z\" (UniqueName: \"kubernetes.io/projected/3e878f77-b89c-469a-8c14-0df126919fb6-kube-api-access-6lg4z\") pod \"openshift-controller-manager-operator-756b6f6bc6-n8h7d\" (UID: \"3e878f77-b89c-469a-8c14-0df126919fb6\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085739 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ee31a47-41e4-4c20-9173-48a250b99c3c-metrics-tls\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085796 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" 
(UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-registration-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085816 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x9h7v\" (UniqueName: \"kubernetes.io/projected/475c8723-6375-4149-8cec-2e5ea00943e3-kube-api-access-x9h7v\") pod \"ingress-canary-ngmbx\" (UID: \"475c8723-6375-4149-8cec-2e5ea00943e3\") " pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085854 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8mq7s\" (UniqueName: \"kubernetes.io/projected/6ee31a47-41e4-4c20-9173-48a250b99c3c-kube-api-access-8mq7s\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085925 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-socket-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085947 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: W0127 15:49:34.085942 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01f5e6a0_654f_41cb_a694_2c86ca8523d8.slice/crio-83f30927c55b27e7489a560cf8890142af2550bfcabfebd744644a8ec8ad78ea WatchSource:0}: Error finding container 83f30927c55b27e7489a560cf8890142af2550bfcabfebd744644a8ec8ad78ea: Status 404 returned error can't find the container with id 83f30927c55b27e7489a560cf8890142af2550bfcabfebd744644a8ec8ad78ea Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.085964 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/475c8723-6375-4149-8cec-2e5ea00943e3-cert\") pod \"ingress-canary-ngmbx\" (UID: \"475c8723-6375-4149-8cec-2e5ea00943e3\") " pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086078 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-mountpoint-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086138 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ee31a47-41e4-4c20-9173-48a250b99c3c-config-volume\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086180 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/101d8d19-546f-4af5-87a2-0e0d86b7593a-node-bootstrap-token\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: 
I0127 15:49:34.086263 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-csi-data-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086294 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/101d8d19-546f-4af5-87a2-0e0d86b7593a-certs\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086346 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fcjk4\" (UniqueName: \"kubernetes.io/projected/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-kube-api-access-fcjk4\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086386 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-858fd\" (UniqueName: \"kubernetes.io/projected/101d8d19-546f-4af5-87a2-0e0d86b7593a-kube-api-access-858fd\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.086409 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-plugins-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: 
I0127 15:49:34.086946 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-plugins-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.088564 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-socket-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.089841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-registration-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.090315 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6ee31a47-41e4-4c20-9173-48a250b99c3c-config-volume\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.091105 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:34.591089062 +0000 UTC m=+142.369299000 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.092452 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6ee31a47-41e4-4c20-9173-48a250b99c3c-metrics-tls\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.092834 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-csi-data-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.093829 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/101d8d19-546f-4af5-87a2-0e0d86b7593a-node-bootstrap-token\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.096865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-mountpoint-dir\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 
15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.101467 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/475c8723-6375-4149-8cec-2e5ea00943e3-cert\") pod \"ingress-canary-ngmbx\" (UID: \"475c8723-6375-4149-8cec-2e5ea00943e3\") " pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.102318 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/101d8d19-546f-4af5-87a2-0e0d86b7593a-certs\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: W0127 15:49:34.103621 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bb42033_3de8_4e48_a4e6_288b9fb3dc8d.slice/crio-0a8011bcb96d74949f2ef2251a509aebc51f1e9654c4ff714b21ddafd0c7fc57 WatchSource:0}: Error finding container 0a8011bcb96d74949f2ef2251a509aebc51f1e9654c4ff714b21ddafd0c7fc57: Status 404 returned error can't find the container with id 0a8011bcb96d74949f2ef2251a509aebc51f1e9654c4ff714b21ddafd0c7fc57 Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.134173 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.149753 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbct5\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-kube-api-access-hbct5\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.154618 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6a5f2acf-c5af-4590-aa1e-5569d1b107bf-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-h68rh\" (UID: \"6a5f2acf-c5af-4590-aa1e-5569d1b107bf\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.165416 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fmpk\" (UniqueName: \"kubernetes.io/projected/c18a7bbc-d9d4-4020-b001-e2c97a70f7da-kube-api-access-6fmpk\") pod \"machine-config-operator-74547568cd-gqldd\" (UID: \"c18a7bbc-d9d4-4020-b001-e2c97a70f7da\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.179835 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.188776 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.190764 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gv9m\" (UniqueName: \"kubernetes.io/projected/10eb2dde-c73b-42e0-9c94-6cab5f1a4025-kube-api-access-4gv9m\") pod \"apiserver-7bbb656c7d-j9gm5\" (UID: \"10eb2dde-c73b-42e0-9c94-6cab5f1a4025\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.190973 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:34.690948551 +0000 UTC m=+142.469158489 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.213597 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ce473d77-3607-455d-b44d-f70055fb5178-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-txzhj\" (UID: \"ce473d77-3607-455d-b44d-f70055fb5178\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.226810 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtl5c\" (UniqueName: \"kubernetes.io/projected/bfe545e4-37ec-41d0-aad6-d8a67025503d-kube-api-access-wtl5c\") pod \"olm-operator-6b444d44fb-nkkpk\" (UID: \"bfe545e4-37ec-41d0-aad6-d8a67025503d\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.243466 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xq6nw\" (UniqueName: \"kubernetes.io/projected/474cf50a-fd9d-48f5-8a91-69a32f459043-kube-api-access-xq6nw\") pod \"package-server-manager-789f6589d5-zfnxn\" (UID: \"474cf50a-fd9d-48f5-8a91-69a32f459043\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.258530 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.280690 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.309681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.310323 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:34.810305619 +0000 UTC m=+142.588515557 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.315973 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x9h7v\" (UniqueName: \"kubernetes.io/projected/475c8723-6375-4149-8cec-2e5ea00943e3-kube-api-access-x9h7v\") pod \"ingress-canary-ngmbx\" (UID: \"475c8723-6375-4149-8cec-2e5ea00943e3\") " pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.316500 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.326277 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.339850 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-858fd\" (UniqueName: \"kubernetes.io/projected/101d8d19-546f-4af5-87a2-0e0d86b7593a-kube-api-access-858fd\") pod \"machine-config-server-f7mlx\" (UID: \"101d8d19-546f-4af5-87a2-0e0d86b7593a\") " pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.341954 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.343443 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-72qz9"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.347156 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-jqnj8"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.357912 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.359310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mq7s\" (UniqueName: \"kubernetes.io/projected/6ee31a47-41e4-4c20-9173-48a250b99c3c-kube-api-access-8mq7s\") pod \"dns-default-vmvmj\" (UID: \"6ee31a47-41e4-4c20-9173-48a250b99c3c\") " pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.380220 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcjk4\" (UniqueName: \"kubernetes.io/projected/adcd14ab-0774-4aaa-9e30-7b861f8c5c1a-kube-api-access-fcjk4\") pod \"csi-hostpathplugin-vdwzh\" (UID: \"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a\") " pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.381203 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlk8z"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.395385 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.409472 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.410279 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.410556 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:34.910535148 +0000 UTC m=+142.688745086 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.419325 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-f7mlx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.430123 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ngmbx" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.434969 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.445464 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w"] Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.480282 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.502554 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.514387 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.515992 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.015973197 +0000 UTC m=+142.794183135 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.615798 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.616274 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.116251448 +0000 UTC m=+142.894461386 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.691267 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" event={"ID":"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d","Type":"ContainerStarted","Data":"0a8011bcb96d74949f2ef2251a509aebc51f1e9654c4ff714b21ddafd0c7fc57"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.693135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" event={"ID":"e411ddc4-0c8a-4cae-b08d-264bae41dffc","Type":"ContainerStarted","Data":"a34db0a7630ac43008491ff7661101afa510611f79d68d49816845d3fc9a831f"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.717108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.718053 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.218010251 +0000 UTC m=+142.996220359 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.723297 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" event={"ID":"4c64795e-faf7-45f0-8256-2ae8191e4e07","Type":"ContainerStarted","Data":"328625ab6e1fd9edafa4a355cce7f2df57226246e4892bc53694bd7c962d8c05"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.733473 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" event={"ID":"b76ed054-9d3c-4679-ac70-d226ec5ec51a","Type":"ContainerStarted","Data":"864f5647099d5add6925ee6275e0e2836cdf0d1a87ae5c0b8d6fdcd66a3f40db"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.735957 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" event={"ID":"ad390cc5-3b01-4343-97a2-2c4385fe5142","Type":"ContainerStarted","Data":"462ac6c603c122c595919aa6ef1f9e487ac5c51c3f30a82456d2abed7f61d0e2"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.738018 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" event={"ID":"aae489c7-6a71-4465-9c25-7d27eb68b318","Type":"ContainerStarted","Data":"17f0f4749132245e2116a10e5b1ebf4ab41e0dd5a2f1bdc21fb1919b0738f03b"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.741501 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" event={"ID":"520dd40a-8e43-4dc0-abf7-2c2b4873b1ab","Type":"ContainerStarted","Data":"3d2004ef8860cfad7ca5b4334814ef78b6e1441f188df33c2b52c49eeaa3180b"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.747775 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xdbd4" event={"ID":"49a98c2a-bf95-449c-8fee-d2b7ed4381db","Type":"ContainerStarted","Data":"50a3e7f6ed3156673c4528da0de14f795b90e9af596ae01d376f1f6a8f51ad5f"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.747830 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-xdbd4" event={"ID":"49a98c2a-bf95-449c-8fee-d2b7ed4381db","Type":"ContainerStarted","Data":"8b33caca9d75661b840dc889ab0edfd3b14f6ba5df307c1258c1956c02325251"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.750949 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mkwg6" event={"ID":"01f5e6a0-654f-41cb-a694-2c86ca8523d8","Type":"ContainerStarted","Data":"7e6d1e97c9d4e9de2badf45747b38b582b7f80e94a2f11900b11d6bade108be5"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.750978 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mkwg6" event={"ID":"01f5e6a0-654f-41cb-a694-2c86ca8523d8","Type":"ContainerStarted","Data":"83f30927c55b27e7489a560cf8890142af2550bfcabfebd744644a8ec8ad78ea"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.751481 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mkwg6" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.753819 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" 
event={"ID":"c464a7a2-0836-47e1-818d-aa8aaed4e657","Type":"ContainerStarted","Data":"2bbc318990ca8c3c6058abdca6493ec0a421c77c97b98ac6568c423b360b4e7d"} Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.754556 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.754634 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.755723 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" event={"ID":"4b935bba-4a22-4c28-b629-367f4a43bbc0","Type":"ContainerStarted","Data":"e0f9ec81f5fe42e96b1c1d93d4ad0a9dabc3ebadc18beae0f7d386afd34e3d77"} Jan 27 15:49:34 crc kubenswrapper[4713]: W0127 15:49:34.767950 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175f6c4b_7354_47b4_b9d1_5867e001bb88.slice/crio-d94ce3e268c8e1f26f0685eab07cff29e3eff719df067b935ca7e7c02e24bb35 WatchSource:0}: Error finding container d94ce3e268c8e1f26f0685eab07cff29e3eff719df067b935ca7e7c02e24bb35: Status 404 returned error can't find the container with id d94ce3e268c8e1f26f0685eab07cff29e3eff719df067b935ca7e7c02e24bb35 Jan 27 15:49:34 crc kubenswrapper[4713]: W0127 15:49:34.780151 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ed86dba_52b7_4652_91c7_aea3c8def1fc.slice/crio-fa75e4ba9507b7ba25cb43485b2b66eef709c8c791b295ebf8670437cea23db5 WatchSource:0}: Error finding container fa75e4ba9507b7ba25cb43485b2b66eef709c8c791b295ebf8670437cea23db5: Status 404 returned error can't find the container with id fa75e4ba9507b7ba25cb43485b2b66eef709c8c791b295ebf8670437cea23db5 Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.819572 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.819788 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.319737773 +0000 UTC m=+143.097947712 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.820662 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.827293 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.327271079 +0000 UTC m=+143.105481017 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.926652 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.927147 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.427094297 +0000 UTC m=+143.205304265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:34 crc kubenswrapper[4713]: I0127 15:49:34.927433 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:34 crc kubenswrapper[4713]: E0127 15:49:34.927835 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.427824408 +0000 UTC m=+143.206034346 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:34.998672 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.006230 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.031440 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.031752 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.032711 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.53267052 +0000 UTC m=+143.310880488 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.133937 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.134392 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.634369721 +0000 UTC m=+143.412579829 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.253104 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.254059 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.754022167 +0000 UTC m=+143.532232105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.355898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.356786 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.856764258 +0000 UTC m=+143.634974266 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.430142 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.441790 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.457993 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.458922 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:35.958896683 +0000 UTC m=+143.737106621 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.482101 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx"] Jan 27 15:49:35 crc kubenswrapper[4713]: W0127 15:49:35.493572 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b31185_ee2b_4482_99c1_f126457bd724.slice/crio-fdca9d648c299559ed832304154951324473d33d58a7969efe327433b7cc2e5f WatchSource:0}: Error finding container fdca9d648c299559ed832304154951324473d33d58a7969efe327433b7cc2e5f: Status 404 returned error can't find the container with id fdca9d648c299559ed832304154951324473d33d58a7969efe327433b7cc2e5f Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.533786 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" podStartSLOduration=120.533760356 podStartE2EDuration="2m0.533760356s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:35.503344645 +0000 UTC m=+143.281554603" watchObservedRunningTime="2026-01-27 15:49:35.533760356 +0000 UTC m=+143.311970294" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.534381 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-mg8jq"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 
15:49:35.556915 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.558501 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.559999 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.560710 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.561069 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.061053067 +0000 UTC m=+143.839263005 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: W0127 15:49:35.592736 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86cad552_e907_4741_9709_e3952fcf470a.slice/crio-b10927a4c65e1528b280042005a4b3ece118ffffd2edab9d21688b7748df14d3 WatchSource:0}: Error finding container b10927a4c65e1528b280042005a4b3ece118ffffd2edab9d21688b7748df14d3: Status 404 returned error can't find the container with id b10927a4c65e1528b280042005a4b3ece118ffffd2edab9d21688b7748df14d3 Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.655756 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.662378 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.664214 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:36.16418538 +0000 UTC m=+143.942395318 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.670377 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.734943 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-tfz52"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.767391 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.767919 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.267900019 +0000 UTC m=+144.046109957 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.775887 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" event={"ID":"ef05c38c-a0a0-40b6-9f4c-351768f9b547","Type":"ContainerStarted","Data":"edc8ef2f58568009e58892e3ff567130bbf5ce1e2812f4c9422ca815cc3a470a"} Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.775958 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" event={"ID":"ef05c38c-a0a0-40b6-9f4c-351768f9b547","Type":"ContainerStarted","Data":"ef812991cb4f8169da86786dd7e7016b4c447a12924c9c5297537068fb6c1eff"} Jan 27 15:49:35 crc kubenswrapper[4713]: W0127 15:49:35.790487 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6a5f2acf_c5af_4590_aa1e_5569d1b107bf.slice/crio-82cdcf42bea18a38fb1e67ebaad6585842597cb5e283fd6968f9ec08e7412d58 WatchSource:0}: Error finding container 82cdcf42bea18a38fb1e67ebaad6585842597cb5e283fd6968f9ec08e7412d58: Status 404 returned error can't find the container with id 82cdcf42bea18a38fb1e67ebaad6585842597cb5e283fd6968f9ec08e7412d58 Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.791666 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" 
event={"ID":"ad390cc5-3b01-4343-97a2-2c4385fe5142","Type":"ContainerStarted","Data":"71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867"} Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.816769 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" event={"ID":"4b935bba-4a22-4c28-b629-367f4a43bbc0","Type":"ContainerStarted","Data":"bcb681dc0b0195eb94a8960283c835342a48994f29cb1aa64cecd7dbded3d7b8"} Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.821683 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.849496 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7vcrd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.849583 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.849699 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-xffcl"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.869250 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.869463 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.369426746 +0000 UTC m=+144.147636684 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.869743 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.872605 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.372584726 +0000 UTC m=+144.150794664 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.890398 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.892841 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" event={"ID":"e114b26c-9613-4582-bfa4-d2ae0bc2e388","Type":"ContainerStarted","Data":"50601a94545884c745fa49fa3c4a34adbefb433fe14591a9e2b59318b5a156f6"} Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.916459 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" event={"ID":"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d","Type":"ContainerStarted","Data":"6265789ef3cd01c1421715ddc99bcde0b2439109a66dc5cdb6bb38dc73c49e5e"} Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.917508 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.924639 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-vmvmj"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.944201 4713 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-t92fx container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: 
connection refused" start-of-body= Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.944275 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.9:6443/healthz\": dial tcp 10.217.0.9:6443: connect: connection refused" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.944353 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk"] Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.951553 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" event={"ID":"7c0aa49a-697a-414c-94e5-fb0594c86d8b","Type":"ContainerStarted","Data":"5a45139b31362fb9618664363921dfea2dbff9b61754f90420cf166ad55291ab"} Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.965054 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:49:35 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Jan 27 15:49:35 crc kubenswrapper[4713]: [+]process-running ok Jan 27 15:49:35 crc kubenswrapper[4713]: healthz check failed Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.965165 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:35 crc kubenswrapper[4713]: I0127 15:49:35.973690 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:35 crc kubenswrapper[4713]: E0127 15:49:35.975299 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.475270956 +0000 UTC m=+144.253480904 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.037939 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" event={"ID":"175f6c4b-7354-47b4-b9d1-5867e001bb88","Type":"ContainerStarted","Data":"8d02aeca0af5be67689252e9aa786c5cd90144253d8b60fe343b724cb8c44ee0"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.038064 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" event={"ID":"175f6c4b-7354-47b4-b9d1-5867e001bb88","Type":"ContainerStarted","Data":"d94ce3e268c8e1f26f0685eab07cff29e3eff719df067b935ca7e7c02e24bb35"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.043820 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn"] 
Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.046814 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" event={"ID":"77bed8c1-f317-4117-8122-2bae18d42813","Type":"ContainerStarted","Data":"490102ed322d17a6c391678a5f25433578512ef0ef46a59fed74c8515ede501b"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.074189 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f7mlx" event={"ID":"101d8d19-546f-4af5-87a2-0e0d86b7593a","Type":"ContainerStarted","Data":"9ca8bbeeef30a4eff599778a5525f3fafb0bde4d40b7156653e04f06abbd5557"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.074292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-f7mlx" event={"ID":"101d8d19-546f-4af5-87a2-0e0d86b7593a","Type":"ContainerStarted","Data":"02a0de7561a7c4f14b1dc46e999c334d1f7fbad7b76ba9a38de5047e2dc84fa6"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.082519 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.083084 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.583068822 +0000 UTC m=+144.361278760 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.107982 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" event={"ID":"3e878f77-b89c-469a-8c14-0df126919fb6","Type":"ContainerStarted","Data":"75e49481ab5f839bcc1878299bbd148189cbb4b0ac49bd6cbc925c268567ee44"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.120140 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.125256 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-pjb9r"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.127647 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" event={"ID":"b27ad314-02e7-4170-8921-a531184c006d","Type":"ContainerStarted","Data":"84bcaf43a24504ed8c9788ace8c58f9b82da6882a204ed2a16dd229d86facb0f"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.132510 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-mkwg6" podStartSLOduration=121.132469997 podStartE2EDuration="2m1.132469997s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
15:49:36.112977189 +0000 UTC m=+143.891187127" watchObservedRunningTime="2026-01-27 15:49:36.132469997 +0000 UTC m=+143.910679935" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.142278 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.143621 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-5rbsl" podStartSLOduration=121.143599235 podStartE2EDuration="2m1.143599235s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.141422333 +0000 UTC m=+143.919632271" watchObservedRunningTime="2026-01-27 15:49:36.143599235 +0000 UTC m=+143.921809173" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.147697 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" event={"ID":"e411ddc4-0c8a-4cae-b08d-264bae41dffc","Type":"ContainerStarted","Data":"4d049f42351d19fb3201bbfdd069a6cc723915e1a42424e1c31ff351a2f608eb"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.147737 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" event={"ID":"e411ddc4-0c8a-4cae-b08d-264bae41dffc","Type":"ContainerStarted","Data":"bf2de2a9eb58b4d79ee13ace9e2d39ddd018843114a50e50b04d21d0e305b4c1"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.161805 4713 generic.go:334] "Generic (PLEG): container finished" podID="b76ed054-9d3c-4679-ac70-d226ec5ec51a" containerID="a3bbf046cfd2092d8f8f5986a9028086e61d03f25f65771e0447d35c4d9867e5" exitCode=0 Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.162274 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" event={"ID":"b76ed054-9d3c-4679-ac70-d226ec5ec51a","Type":"ContainerDied","Data":"a3bbf046cfd2092d8f8f5986a9028086e61d03f25f65771e0447d35c4d9867e5"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.191031 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.192357 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.199404 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.699373502 +0000 UTC m=+144.477583430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.218786 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-vdwzh"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.224700 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-gcqw5" podStartSLOduration=121.224679547 podStartE2EDuration="2m1.224679547s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.179824362 +0000 UTC m=+143.958034300" watchObservedRunningTime="2026-01-27 15:49:36.224679547 +0000 UTC m=+144.002889485" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.228131 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-xdbd4" podStartSLOduration=120.228122535 podStartE2EDuration="2m0.228122535s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.218642604 +0000 UTC m=+143.996852552" watchObservedRunningTime="2026-01-27 15:49:36.228122535 +0000 UTC m=+144.006332473" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.228737 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5"] Jan 27 15:49:36 
crc kubenswrapper[4713]: I0127 15:49:36.232208 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" event={"ID":"c464a7a2-0836-47e1-818d-aa8aaed4e657","Type":"ContainerStarted","Data":"87dcebff32e1afe0553be0e69023562e09b23c048ce8f023badcff42e5ded0e1"} Jan 27 15:49:36 crc kubenswrapper[4713]: W0127 15:49:36.249543 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ff13383_b1fe_4905_81f2_6921bcb452fb.slice/crio-c9fc390e75a7cdfa05ca643f5cebf67f7fe63ccbf9df19aa39d9362c2de200b2 WatchSource:0}: Error finding container c9fc390e75a7cdfa05ca643f5cebf67f7fe63ccbf9df19aa39d9362c2de200b2: Status 404 returned error can't find the container with id c9fc390e75a7cdfa05ca643f5cebf67f7fe63ccbf9df19aa39d9362c2de200b2 Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.280441 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ngmbx"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.283957 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.284015 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-9vt68"] Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.286626 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" event={"ID":"5b77a577-edba-4a31-9885-ae11f403595a","Type":"ContainerStarted","Data":"1d10d1b300e0856ac55e01156ad3f39deba06c2d743062a8de9497d7e627d13d"} Jan 27 15:49:36 crc kubenswrapper[4713]: W0127 15:49:36.298502 4713 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadcd14ab_0774_4aaa_9e30_7b861f8c5c1a.slice/crio-59c583f13c2d7164474b7a2609367cec2838b66d8f39a8c28d3dde8235f21f62 WatchSource:0}: Error finding container 59c583f13c2d7164474b7a2609367cec2838b66d8f39a8c28d3dde8235f21f62: Status 404 returned error can't find the container with id 59c583f13c2d7164474b7a2609367cec2838b66d8f39a8c28d3dde8235f21f62 Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.302295 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.305818 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:36.805767838 +0000 UTC m=+144.583977776 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: W0127 15:49:36.317922 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce473d77_3607_455d_b44d_f70055fb5178.slice/crio-c12faa4b69853d73ff6e7ebc113dad9bd8b04ce1e6f257a088c055774a3490b8 WatchSource:0}: Error finding container c12faa4b69853d73ff6e7ebc113dad9bd8b04ce1e6f257a088c055774a3490b8: Status 404 returned error can't find the container with id c12faa4b69853d73ff6e7ebc113dad9bd8b04ce1e6f257a088c055774a3490b8 Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.341714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" event={"ID":"86cad552-e907-4741-9709-e3952fcf470a","Type":"ContainerStarted","Data":"b10927a4c65e1528b280042005a4b3ece118ffffd2edab9d21688b7748df14d3"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.372388 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" event={"ID":"5ed86dba-52b7-4652-91c7-aea3c8def1fc","Type":"ContainerStarted","Data":"c0568a5dc169c1bff5c7b3062193f10d576f81dfed8150e415c3b9b92905c9e9"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.372917 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" 
event={"ID":"5ed86dba-52b7-4652-91c7-aea3c8def1fc","Type":"ContainerStarted","Data":"fa75e4ba9507b7ba25cb43485b2b66eef709c8c791b295ebf8670437cea23db5"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.372937 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.392836 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" event={"ID":"7d587445-25fe-458a-8157-47d510ef09d4","Type":"ContainerStarted","Data":"f37fb0d92ce9cbba54cb1be6413821a6b22f82da4f7213fb214a40f63f4a8a3d"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.395660 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hlk8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body= Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.395743 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" Jan 27 15:49:36 crc kubenswrapper[4713]: W0127 15:49:36.396464 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6936ff4_89a3_4627_aa92_34a950e2ee0f.slice/crio-50b91b7bc5d226ef67da53a1f47e5b86a676e0bee2ac85564bc8bccbd4dd7f3a WatchSource:0}: Error finding container 50b91b7bc5d226ef67da53a1f47e5b86a676e0bee2ac85564bc8bccbd4dd7f3a: Status 404 returned error can't find the container with id 50b91b7bc5d226ef67da53a1f47e5b86a676e0bee2ac85564bc8bccbd4dd7f3a Jan 27 15:49:36 crc 
kubenswrapper[4713]: I0127 15:49:36.406305 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.406377 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.407420 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" event={"ID":"00b31185-ee2b-4482-99c1-f126457bd724","Type":"ContainerStarted","Data":"fdca9d648c299559ed832304154951324473d33d58a7969efe327433b7cc2e5f"} Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.407466 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.407843 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.408958 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:36.908939572 +0000 UTC m=+144.687149510 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.409508 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.413192 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.424258 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.425365 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.431579 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" podStartSLOduration=120.431545449 podStartE2EDuration="2m0.431545449s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.413387849 +0000 UTC m=+144.191597787" watchObservedRunningTime="2026-01-27 15:49:36.431545449 +0000 UTC m=+144.209755387" Jan 27 15:49:36 crc 
kubenswrapper[4713]: I0127 15:49:36.435768 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qbcpm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.435844 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" podUID="00b31185-ee2b-4482-99c1-f126457bd724" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.500382 4713 csr.go:261] certificate signing request csr-jc7wt is approved, waiting to be issued Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.509588 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.513402 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.013377442 +0000 UTC m=+144.791587570 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.513761 4713 csr.go:257] certificate signing request csr-jc7wt is issued Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.566300 4713 patch_prober.go:28] interesting pod/apiserver-76f77b778f-5cl72 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]log ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]etcd ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/generic-apiserver-start-informers ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/max-in-flight-filter ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/image.openshift.io-apiserver-caches ok Jan 27 15:49:36 crc kubenswrapper[4713]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Jan 27 15:49:36 crc kubenswrapper[4713]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/project.openshift.io-projectcache ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Jan 27 15:49:36 crc kubenswrapper[4713]: 
[+]poststarthook/openshift.io-startinformers ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/openshift.io-restmapperupdater ok Jan 27 15:49:36 crc kubenswrapper[4713]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 27 15:49:36 crc kubenswrapper[4713]: livez check failed Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.566365 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-5cl72" podUID="f0905e52-1e55-4a6b-887a-6680bd4d2004" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.613083 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.613444 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.113426876 +0000 UTC m=+144.891636814 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.648873 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-f7mlx" podStartSLOduration=5.64884577 podStartE2EDuration="5.64884577s" podCreationTimestamp="2026-01-27 15:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.612251773 +0000 UTC m=+144.390461711" watchObservedRunningTime="2026-01-27 15:49:36.64884577 +0000 UTC m=+144.427055708" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.692172 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-v744w" podStartSLOduration=120.69215027 podStartE2EDuration="2m0.69215027s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.689478474 +0000 UTC m=+144.467688412" watchObservedRunningTime="2026-01-27 15:49:36.69215027 +0000 UTC m=+144.470360208" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.714468 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.714910 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.214895632 +0000 UTC m=+144.993105560 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.727683 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-4sxh8" podStartSLOduration=120.727661297 podStartE2EDuration="2m0.727661297s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.726812783 +0000 UTC m=+144.505022721" watchObservedRunningTime="2026-01-27 15:49:36.727661297 +0000 UTC m=+144.505871235" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.847580 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:36 crc kubenswrapper[4713]: 
E0127 15:49:36.857672 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.357603447 +0000 UTC m=+145.135813385 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.888168 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" podStartSLOduration=121.888133941 podStartE2EDuration="2m1.888133941s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.821894285 +0000 UTC m=+144.600104243" watchObservedRunningTime="2026-01-27 15:49:36.888133941 +0000 UTC m=+144.666343879" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.951457 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:36 crc kubenswrapper[4713]: E0127 15:49:36.951944 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.451929488 +0000 UTC m=+145.230139426 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.952350 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-h5w6b" podStartSLOduration=120.952328149 podStartE2EDuration="2m0.952328149s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.950730693 +0000 UTC m=+144.728940651" watchObservedRunningTime="2026-01-27 15:49:36.952328149 +0000 UTC m=+144.730538087" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.952865 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" podStartSLOduration=120.952857754 podStartE2EDuration="2m0.952857754s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.886521445 +0000 UTC m=+144.664731393" watchObservedRunningTime="2026-01-27 15:49:36.952857754 +0000 UTC m=+144.731067692" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.961983 4713 patch_prober.go:28] 
interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:49:36 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Jan 27 15:49:36 crc kubenswrapper[4713]: [+]process-running ok Jan 27 15:49:36 crc kubenswrapper[4713]: healthz check failed Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.962066 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:36 crc kubenswrapper[4713]: I0127 15:49:36.984557 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-wrjvf" podStartSLOduration=121.984530771 podStartE2EDuration="2m1.984530771s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:36.983336757 +0000 UTC m=+144.761546705" watchObservedRunningTime="2026-01-27 15:49:36.984530771 +0000 UTC m=+144.762740709" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.021273 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" podStartSLOduration=121.021239502 podStartE2EDuration="2m1.021239502s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.019720759 +0000 UTC m=+144.797930707" watchObservedRunningTime="2026-01-27 15:49:37.021239502 +0000 UTC m=+144.799449430" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 
15:49:37.041563 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" podStartSLOduration=121.041535693 podStartE2EDuration="2m1.041535693s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.040527114 +0000 UTC m=+144.818737062" watchObservedRunningTime="2026-01-27 15:49:37.041535693 +0000 UTC m=+144.819745631" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.052802 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.053714 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.553675441 +0000 UTC m=+145.331885379 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.158837 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.159683 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.659665365 +0000 UTC m=+145.437875303 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.260845 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.261587 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.761566383 +0000 UTC m=+145.539776321 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.363330 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.363822 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.863802129 +0000 UTC m=+145.642012067 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.419697 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" event={"ID":"474cf50a-fd9d-48f5-8a91-69a32f459043","Type":"ContainerStarted","Data":"06a4f5a0dee710d2bde4fb467af8855518ef37ff2327f918ceed6df2bf72bdd1"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.439938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" event={"ID":"77bed8c1-f317-4117-8122-2bae18d42813","Type":"ContainerStarted","Data":"3a867179bc5b7f5ea82182ac9a4741e16b8bad739a2f8d97705adb6a8f3f39b7"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.440742 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.445334 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vkk8k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.445404 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" podUID="77bed8c1-f317-4117-8122-2bae18d42813" 
containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.446487 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" event={"ID":"6a5f2acf-c5af-4590-aa1e-5569d1b107bf","Type":"ContainerStarted","Data":"02e0d9acd3572229abae7dd60278f193002fc31319122a558d99979622e8c57f"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.446530 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" event={"ID":"6a5f2acf-c5af-4590-aa1e-5569d1b107bf","Type":"ContainerStarted","Data":"82cdcf42bea18a38fb1e67ebaad6585842597cb5e283fd6968f9ec08e7412d58"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.450833 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vmvmj" event={"ID":"6ee31a47-41e4-4c20-9173-48a250b99c3c","Type":"ContainerStarted","Data":"d5862f08c92ad0436f095f07de4a8d3c537c30534e58d34e6951c679791e1873"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.459169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" event={"ID":"5ff13383-b1fe-4905-81f2-6921bcb452fb","Type":"ContainerStarted","Data":"bd39f98f6a4dec89ac0f95858cbd9350f195ad7d662f5478a9bc749698b0f131"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.459235 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" event={"ID":"5ff13383-b1fe-4905-81f2-6921bcb452fb","Type":"ContainerStarted","Data":"c9fc390e75a7cdfa05ca643f5cebf67f7fe63ccbf9df19aa39d9362c2de200b2"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.462773 4713 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" event={"ID":"7c0aa49a-697a-414c-94e5-fb0594c86d8b","Type":"ContainerStarted","Data":"a8c8fc9df920e3a75267c56a3bbb3425d7ad6cf26ddc4194eb3decf3571fdcca"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.464437 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.464858 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.964827581 +0000 UTC m=+145.743037559 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.465211 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.465702 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:37.965686916 +0000 UTC m=+145.743896854 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.470496 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" podStartSLOduration=121.470467752 podStartE2EDuration="2m1.470467752s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.469641189 +0000 UTC m=+145.247851137" watchObservedRunningTime="2026-01-27 15:49:37.470467752 +0000 UTC m=+145.248677690" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.477399 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" event={"ID":"ef05c38c-a0a0-40b6-9f4c-351768f9b547","Type":"ContainerStarted","Data":"f091b3f0c4a8af9bfd17fcfec75bd71b32ee8e60718dbca579127e98cde2ef5a"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.486535 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" event={"ID":"15eba547-a66e-48be-9d4e-fb4262515042","Type":"ContainerStarted","Data":"9538696360acc2a6f8a838877f9e9e86d214e08062ca510319d4d3f6fa174536"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.486978 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" 
event={"ID":"15eba547-a66e-48be-9d4e-fb4262515042","Type":"ContainerStarted","Data":"62ef0dd4c41316a652ae01641e0b71ad3c834e6c6247ae2e9ebbb8dccc4946d0"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.496213 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" event={"ID":"01fcd844-681c-415a-ad44-deaecdf5dc67","Type":"ContainerStarted","Data":"a9d59e8c8b8d043f7e76f4be036da70576621c2029ec7510999531648899fcaf"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.502891 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-wbnzn" podStartSLOduration=121.50286222 podStartE2EDuration="2m1.50286222s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.495148189 +0000 UTC m=+145.273358147" watchObservedRunningTime="2026-01-27 15:49:37.50286222 +0000 UTC m=+145.281072158" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.504062 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ngmbx" event={"ID":"475c8723-6375-4149-8cec-2e5ea00943e3","Type":"ContainerStarted","Data":"180f63bfc517d293abdd091f7c7ae595a4453f2207b18760628d6e3a12f65fa2"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.515395 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-27 15:44:36 +0000 UTC, rotation deadline is 2026-11-19 19:38:33.215592927 +0000 UTC Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.515443 4713 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7107h48m55.700152847s for next certificate rotation Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.518387 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" event={"ID":"5b77a577-edba-4a31-9885-ae11f403595a","Type":"ContainerStarted","Data":"8a1c4024d5dad836aa2886038f92d84e823b8ea72c429479a36c506ff3b23dd5"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.533301 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tfz52" event={"ID":"f57fea35-d5a7-43ca-843f-c8b81ca8bf69","Type":"ContainerStarted","Data":"0e1983a0fb4972c073c06e81eab8804d091954cdb7c9326adb9cce468cc9c29a"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.533358 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-tfz52" event={"ID":"f57fea35-d5a7-43ca-843f-c8b81ca8bf69","Type":"ContainerStarted","Data":"ae786784c1179b04923d404aab5242e228a2349664f37425326ff2d3e358f219"} Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.533905 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.539109 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-tfz52 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.539176 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tfz52" podUID="f57fea35-d5a7-43ca-843f-c8b81ca8bf69" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.541667 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" event={"ID":"bfe545e4-37ec-41d0-aad6-d8a67025503d","Type":"ContainerStarted","Data":"15e362b30d8fb544b2582a62e330f0e315342799a302518e263d8512d559aac5"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.541729 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" event={"ID":"bfe545e4-37ec-41d0-aad6-d8a67025503d","Type":"ContainerStarted","Data":"4679a0d4ea50234fa93e9df57c3340732bf91efbd0479bbbec361ae7fb217745"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.544567 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.549346 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-nkkpk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body=
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.549476 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" podUID="bfe545e4-37ec-41d0-aad6-d8a67025503d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.556338 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" event={"ID":"e114b26c-9613-4582-bfa4-d2ae0bc2e388","Type":"ContainerStarted","Data":"75ac4dc1d93206f86e15721da81452c5c4a1087f461d83a7c49cda79b679f4e5"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.558051 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zppll" podStartSLOduration=122.558007839 podStartE2EDuration="2m2.558007839s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.528162104 +0000 UTC m=+145.306372072" watchObservedRunningTime="2026-01-27 15:49:37.558007839 +0000 UTC m=+145.336217767"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.561618 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-8jtx5" podStartSLOduration=121.561602312 podStartE2EDuration="2m1.561602312s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.55595418 +0000 UTC m=+145.334164118" watchObservedRunningTime="2026-01-27 15:49:37.561602312 +0000 UTC m=+145.339812250"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.560895 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" event={"ID":"19b94a05-aa83-4939-828b-8715ccc15dd7","Type":"ContainerStarted","Data":"2aa97b3aee325ecf6e5b2a3f69fe8f38c71a3d766b08d5cb248d85bc373b60ca"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.566982 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.569091 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.069069285 +0000 UTC m=+145.847279223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.582100 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" event={"ID":"c6936ff4-89a3-4627-aa92-34a950e2ee0f","Type":"ContainerStarted","Data":"50b91b7bc5d226ef67da53a1f47e5b86a676e0bee2ac85564bc8bccbd4dd7f3a"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.595994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xffcl" event={"ID":"a38fa71e-fb6f-4742-9291-0038d946cfec","Type":"ContainerStarted","Data":"a4c570c815cd1bb5e8282cacf8e22e4fa04eb97b5492994dfc0bd94da1710e03"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.596361 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-xffcl" event={"ID":"a38fa71e-fb6f-4742-9291-0038d946cfec","Type":"ContainerStarted","Data":"369c660001d1e5341f3e79d26ae25c7bd034127d36dcd06994d753b230da5dcb"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.601474 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" event={"ID":"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a","Type":"ContainerStarted","Data":"59c583f13c2d7164474b7a2609367cec2838b66d8f39a8c28d3dde8235f21f62"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.610905 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" event={"ID":"7d587445-25fe-458a-8157-47d510ef09d4","Type":"ContainerStarted","Data":"cfed0a38f49fad7d8c94d581c669c132ab26677dff1dfec2dbccdcbe205f62a7"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.614114 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" event={"ID":"86cad552-e907-4741-9709-e3952fcf470a","Type":"ContainerStarted","Data":"13b2107423485d22474ff7743ac408e9012f1985ca272a898da0fc322472eee6"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.617218 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-tfz52" podStartSLOduration=122.617193813 podStartE2EDuration="2m2.617193813s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.590603762 +0000 UTC m=+145.368813710" watchObservedRunningTime="2026-01-27 15:49:37.617193813 +0000 UTC m=+145.395403751"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.618748 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" podStartSLOduration=121.618742367 podStartE2EDuration="2m1.618742367s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.616307588 +0000 UTC m=+145.394517536" watchObservedRunningTime="2026-01-27 15:49:37.618742367 +0000 UTC m=+145.396952295"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.623959 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" event={"ID":"c18a7bbc-d9d4-4020-b001-e2c97a70f7da","Type":"ContainerStarted","Data":"4d950066b8650e7491b139daf4d82d8a939c0106704b44a3fd7a2a521f134d71"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.626561 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" event={"ID":"b27ad314-02e7-4170-8921-a531184c006d","Type":"ContainerStarted","Data":"76bef3b1f3a2eb1326edf2142ad6b5ac373447a923e697cb387aad8f933a8f8d"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.626611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" event={"ID":"b27ad314-02e7-4170-8921-a531184c006d","Type":"ContainerStarted","Data":"898743be2505389906b70aea94c216cc903c8d69cdb63c81ee5be337864f573f"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.641635 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-xffcl" podStartSLOduration=122.641595372 podStartE2EDuration="2m2.641595372s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.638554815 +0000 UTC m=+145.416764763" watchObservedRunningTime="2026-01-27 15:49:37.641595372 +0000 UTC m=+145.419805310"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.643921 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" event={"ID":"b76ed054-9d3c-4679-ac70-d226ec5ec51a","Type":"ContainerStarted","Data":"4ae852208fcdb950ef6d0b90cb0e8bef294a201e24e9d4780e391ebf78556488"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.645661 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" event={"ID":"ce473d77-3607-455d-b44d-f70055fb5178","Type":"ContainerStarted","Data":"c12faa4b69853d73ff6e7ebc113dad9bd8b04ce1e6f257a088c055774a3490b8"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.653233 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" event={"ID":"3e878f77-b89c-469a-8c14-0df126919fb6","Type":"ContainerStarted","Data":"c2aa5c675d90f7f94d92cf320ab24bcba5c3267924f33bb4271615c4c7e48063"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.661382 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" event={"ID":"00b31185-ee2b-4482-99c1-f126457bd724","Type":"ContainerStarted","Data":"302e1d7672c4c75b35ae350e1d7e64bbe9e08dcfdffd554a11ceca904110256e"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.661472 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" podStartSLOduration=122.66144479 podStartE2EDuration="2m2.66144479s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.660608316 +0000 UTC m=+145.438818254" watchObservedRunningTime="2026-01-27 15:49:37.66144479 +0000 UTC m=+145.439654728"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.662384 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qbcpm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body=
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.662486 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" podUID="00b31185-ee2b-4482-99c1-f126457bd724" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.665205 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" event={"ID":"10eb2dde-c73b-42e0-9c94-6cab5f1a4025","Type":"ContainerStarted","Data":"2676b4a969094bb78877bf4594b3e4d325f193ea5f837840cdb4ab955297a27d"}
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.667012 4713 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-7vcrd container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body=
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.667117 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.12:8443/healthz\": dial tcp 10.217.0.12:8443: connect: connection refused"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.669498 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.671582 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.171463057 +0000 UTC m=+145.949672995 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.673101 4713 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-hlk8z container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused" start-of-body=
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.673180 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.34:8080/healthz\": dial tcp 10.217.0.34:8080: connect: connection refused"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.684681 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.684604 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-tqc9f" podStartSLOduration=121.684570052 podStartE2EDuration="2m1.684570052s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.676405268 +0000 UTC m=+145.454615206" watchObservedRunningTime="2026-01-27 15:49:37.684570052 +0000 UTC m=+145.462780010"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.754357 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n8h7d" podStartSLOduration=122.754332289 podStartE2EDuration="2m2.754332289s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:37.720551202 +0000 UTC m=+145.498761140" watchObservedRunningTime="2026-01-27 15:49:37.754332289 +0000 UTC m=+145.532542227"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.777805 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.780672 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.280645163 +0000 UTC m=+146.058855111 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.880859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.881502 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.381473579 +0000 UTC m=+146.159683667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.960058 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 27 15:49:37 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld
Jan 27 15:49:37 crc kubenswrapper[4713]: [+]process-running ok
Jan 27 15:49:37 crc kubenswrapper[4713]: healthz check failed
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.960139 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 27 15:49:37 crc kubenswrapper[4713]: I0127 15:49:37.982213 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:37 crc kubenswrapper[4713]: E0127 15:49:37.982730 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.482708228 +0000 UTC m=+146.260918166 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.083860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.084338 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.584320927 +0000 UTC m=+146.362530865 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.185295 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.185523 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.685501174 +0000 UTC m=+146.463711112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.185683 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.186026 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.686019218 +0000 UTC m=+146.464229156 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.286863 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.287414 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.787389821 +0000 UTC m=+146.565599759 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.389050 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.389920 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.889901446 +0000 UTC m=+146.668111384 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.490895 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.491521 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:38.991497304 +0000 UTC m=+146.769707242 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.594072 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.594678 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.094653168 +0000 UTC m=+146.872863106 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.691833 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" event={"ID":"c6936ff4-89a3-4627-aa92-34a950e2ee0f","Type":"ContainerStarted","Data":"9af2b59c1b5b0bb7c5acac91148ac6a315e69a031d5b6fefcc2d1981edd65d1e"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.696477 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.696936 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.196912085 +0000 UTC m=+146.975122023 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.698892 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" event={"ID":"7c0aa49a-697a-414c-94e5-fb0594c86d8b","Type":"ContainerStarted","Data":"9507b8eae6a8921aaf09c8f9077ce0304d0f82bd31eb80c4dbe6bd96d2e5ca85"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.717963 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-rznpq" podStartSLOduration=122.717935447 podStartE2EDuration="2m2.717935447s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:38.715114336 +0000 UTC m=+146.493324274" watchObservedRunningTime="2026-01-27 15:49:38.717935447 +0000 UTC m=+146.496145385"
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.734688 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" event={"ID":"e114b26c-9613-4582-bfa4-d2ae0bc2e388","Type":"ContainerStarted","Data":"d70dca615845d28840a15bf4bab60ec0a44aff85d46caff658b1ad840e286a37"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.760209 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" event={"ID":"15eba547-a66e-48be-9d4e-fb4262515042","Type":"ContainerStarted","Data":"cc9a53f5c975b4735df2e1ffd48931a79bd4535ec65fea72b13090145e587533"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.772315 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" event={"ID":"01fcd844-681c-415a-ad44-deaecdf5dc67","Type":"ContainerStarted","Data":"ee1c6789a38def56dc0e34dec3ed1649daa332fd79cd4be07e1e30aba8c0337d"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.794190 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" event={"ID":"7d587445-25fe-458a-8157-47d510ef09d4","Type":"ContainerStarted","Data":"efb587491ebbc8323ae6dd18a1ad1bc485844c7def9690d957e7a99c2104d909"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.798072 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.800746 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.300719267 +0000 UTC m=+147.078929275 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.805331 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" event={"ID":"19b94a05-aa83-4939-828b-8715ccc15dd7","Type":"ContainerStarted","Data":"0ac387cb6aff3dccecf19842fa9e94b282914c462c1c3100c94112c5b3f5ed88"}
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.809083 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-kqg95" podStartSLOduration=122.809053086 podStartE2EDuration="2m2.809053086s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:38.80570483 +0000 UTC m=+146.583914778" watchObservedRunningTime="2026-01-27 15:49:38.809053086 +0000 UTC m=+146.587263024"
Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.809235 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-mg8jq" podStartSLOduration=122.809228311 podStartE2EDuration="2m2.809228311s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:38.772893121 +0000 UTC m=+146.551103069" watchObservedRunningTime="2026-01-27 15:49:38.809228311 +0000 UTC m=+146.587438259"
Jan 27 15:49:38 crc kubenswrapper[4713]:
I0127 15:49:38.838462 4713 generic.go:334] "Generic (PLEG): container finished" podID="10eb2dde-c73b-42e0-9c94-6cab5f1a4025" containerID="2ee529d7b5cb6acbd91c5ed788c14e520ed7bb7f5de5bd8701207ad2665dc751" exitCode=0 Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.839434 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" event={"ID":"10eb2dde-c73b-42e0-9c94-6cab5f1a4025","Type":"ContainerDied","Data":"2ee529d7b5cb6acbd91c5ed788c14e520ed7bb7f5de5bd8701207ad2665dc751"} Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.871457 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" event={"ID":"474cf50a-fd9d-48f5-8a91-69a32f459043","Type":"ContainerStarted","Data":"5102d7bbda82aeff21d212ca706edec16597131dafa46ef5d8f8fffe127911ab"} Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.871523 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" event={"ID":"474cf50a-fd9d-48f5-8a91-69a32f459043","Type":"ContainerStarted","Data":"d4a938a97738c4eba9b80e97005f8ba3e982a6c22e9ac4badae1031e85395333"} Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.872280 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.895450 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-tgdd8" podStartSLOduration=122.895420609 podStartE2EDuration="2m2.895420609s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:38.85077596 +0000 UTC m=+146.628985908" 
watchObservedRunningTime="2026-01-27 15:49:38.895420609 +0000 UTC m=+146.673630547" Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.902736 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:38 crc kubenswrapper[4713]: E0127 15:49:38.903851 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.403831649 +0000 UTC m=+147.182041587 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.989272 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" podStartSLOduration=122.989235195 podStartE2EDuration="2m2.989235195s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:38.977874599 +0000 UTC m=+146.756084547" watchObservedRunningTime="2026-01-27 15:49:38.989235195 +0000 UTC m=+146.767445133" Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.996615 
4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:49:38 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Jan 27 15:49:38 crc kubenswrapper[4713]: [+]process-running ok Jan 27 15:49:38 crc kubenswrapper[4713]: healthz check failed Jan 27 15:49:38 crc kubenswrapper[4713]: I0127 15:49:38.996695 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:38.999490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" event={"ID":"ce473d77-3607-455d-b44d-f70055fb5178","Type":"ContainerStarted","Data":"8006c67eadaafab21eb5759b9ecb319c05b30d404005f40ce367f52ce1aaf24d"} Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:38.999550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" event={"ID":"c18a7bbc-d9d4-4020-b001-e2c97a70f7da","Type":"ContainerStarted","Data":"4745421609f7c0264718f00250ff569b1b19c232da1d58a9b4d339084f815443"} Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:38.999563 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" event={"ID":"c18a7bbc-d9d4-4020-b001-e2c97a70f7da","Type":"ContainerStarted","Data":"118188fa049de8cf003613488cdcb5ea4e0c98c5f6ccee8e233fe611d268b8de"} Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.005851 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-etcd-operator/etcd-operator-b45778765-9vt68" podStartSLOduration=123.005812819 podStartE2EDuration="2m3.005812819s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:38.896480019 +0000 UTC m=+146.674689967" watchObservedRunningTime="2026-01-27 15:49:39.005812819 +0000 UTC m=+146.784022757" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.006169 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.009429 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.509413182 +0000 UTC m=+147.287623120 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.027298 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vmvmj" event={"ID":"6ee31a47-41e4-4c20-9173-48a250b99c3c","Type":"ContainerStarted","Data":"1a6d545bc13a5a7cceb6b8a76539b6ae0a869656b7aa611b22b9e3b923a3adfd"} Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.036892 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.050340 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ngmbx" event={"ID":"475c8723-6375-4149-8cec-2e5ea00943e3","Type":"ContainerStarted","Data":"d1a59ae1fe12c4bd75e01c3f0b3f9cc53c880594161721d8b366fd15f50d45a8"} Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.051860 4713 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-vkk8k container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: connect: connection refused" start-of-body= Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.051930 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" podUID="77bed8c1-f317-4117-8122-2bae18d42813" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.33:8443/healthz\": dial tcp 10.217.0.33:8443: 
connect: connection refused" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.052053 4713 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-nkkpk container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" start-of-body= Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.052100 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" podUID="bfe545e4-37ec-41d0-aad6-d8a67025503d" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.21:8443/healthz\": dial tcp 10.217.0.21:8443: connect: connection refused" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.054541 4713 patch_prober.go:28] interesting pod/console-operator-58897d9998-tfz52 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" start-of-body= Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.054581 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-tfz52" podUID="f57fea35-d5a7-43ca-843f-c8b81ca8bf69" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.11:8443/readyz\": dial tcp 10.217.0.11:8443: connect: connection refused" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.057945 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-jqnj8" podStartSLOduration=123.057925731 podStartE2EDuration="2m3.057925731s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 
15:49:39.056413498 +0000 UTC m=+146.834623436" watchObservedRunningTime="2026-01-27 15:49:39.057925731 +0000 UTC m=+146.836135669" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.063050 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.077131 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-pjb9r" podStartSLOduration=123.07710661 podStartE2EDuration="2m3.07710661s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.075773432 +0000 UTC m=+146.853983370" watchObservedRunningTime="2026-01-27 15:49:39.07710661 +0000 UTC m=+146.855316548" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.123707 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.125561 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.625544657 +0000 UTC m=+147.403754595 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.155269 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ngmbx" podStartSLOduration=8.155241787 podStartE2EDuration="8.155241787s" podCreationTimestamp="2026-01-27 15:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.123057126 +0000 UTC m=+146.901267064" watchObservedRunningTime="2026-01-27 15:49:39.155241787 +0000 UTC m=+146.933451725" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.203250 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-vmvmj" podStartSLOduration=8.203226251 podStartE2EDuration="8.203226251s" podCreationTimestamp="2026-01-27 15:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.156565015 +0000 UTC m=+146.934774963" watchObservedRunningTime="2026-01-27 15:49:39.203226251 +0000 UTC m=+146.981436189" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.232285 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-h68rh" podStartSLOduration=123.232259602 podStartE2EDuration="2m3.232259602s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.231727867 +0000 UTC m=+147.009937815" watchObservedRunningTime="2026-01-27 15:49:39.232259602 +0000 UTC m=+147.010469540" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.233253 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-gqldd" podStartSLOduration=123.233248261 podStartE2EDuration="2m3.233248261s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.203668544 +0000 UTC m=+146.981878472" watchObservedRunningTime="2026-01-27 15:49:39.233248261 +0000 UTC m=+147.011458199" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.233370 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.233809 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.733791426 +0000 UTC m=+147.512001364 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.318818 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" podStartSLOduration=124.31878952 podStartE2EDuration="2m4.31878952s" podCreationTimestamp="2026-01-27 15:47:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.28142543 +0000 UTC m=+147.059635358" watchObservedRunningTime="2026-01-27 15:49:39.31878952 +0000 UTC m=+147.096999458" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.334988 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.335179 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.835143378 +0000 UTC m=+147.613353316 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.335413 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.335992 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.835971271 +0000 UTC m=+147.614181209 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.436709 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.437290 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.937250451 +0000 UTC m=+147.715460419 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.437375 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.438156 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:39.938144567 +0000 UTC m=+147.716354505 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.539592 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.539846 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.039792637 +0000 UTC m=+147.818002575 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.539957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.540644 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.040625221 +0000 UTC m=+147.818835159 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.641868 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.642105 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.142064765 +0000 UTC m=+147.920274703 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.642313 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.642816 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.142803196 +0000 UTC m=+147.921013304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.695440 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.743839 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.744091 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.244032834 +0000 UTC m=+148.022242782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.744185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.744687 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.244675563 +0000 UTC m=+148.022885501 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.845153 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.845304 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.345282583 +0000 UTC m=+148.123492521 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.845744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.845773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.845805 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.845859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.845903 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.846410 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.346385955 +0000 UTC m=+148.124595883 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.848226 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.855682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.857781 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.858939 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.947341 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.947595 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.447556051 +0000 UTC m=+148.225765989 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.948595 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:39 crc kubenswrapper[4713]: E0127 15:49:39.949009 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.448997152 +0000 UTC m=+148.227207090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.961438 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:49:39 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Jan 27 15:49:39 crc kubenswrapper[4713]: [+]process-running ok Jan 27 15:49:39 crc kubenswrapper[4713]: healthz check failed Jan 27 15:49:39 crc kubenswrapper[4713]: I0127 15:49:39.961516 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.050138 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.050373 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:40.550324813 +0000 UTC m=+148.328534761 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.050529 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.050981 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.550968982 +0000 UTC m=+148.329178920 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.052782 4713 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-qbcpm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.052863 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" podUID="00b31185-ee2b-4482-99c1-f126457bd724" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.055089 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-vmvmj" event={"ID":"6ee31a47-41e4-4c20-9173-48a250b99c3c","Type":"ContainerStarted","Data":"4e3b360b3907b0d8c101c9a7b58280189458f68664c974fb46b5a0234a72ccc3"} Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.056734 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" event={"ID":"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a","Type":"ContainerStarted","Data":"cd09f4a872c4262d809a3135eb2f4cccd0d14497f19c8529508a1c71e1a3da9f"} Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 
15:49:40.059954 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" event={"ID":"10eb2dde-c73b-42e0-9c94-6cab5f1a4025","Type":"ContainerStarted","Data":"94219c8cad6d5a7e90dc3970fe0c10f69dd56088dc5693831c095dd86ce9758b"} Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.072608 4713 patch_prober.go:28] interesting pod/openshift-config-operator-7777fb866f-72qz9 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" start-of-body= Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.072769 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" podUID="b76ed054-9d3c-4679-ac70-d226ec5ec51a" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.217.0.22:8443/healthz\": dial tcp 10.217.0.22:8443: connect: connection refused" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.089236 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkkpk" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.113545 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.124541 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.129881 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.153276 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.157221 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.657190043 +0000 UTC m=+148.435399981 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.167632 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-txzhj" podStartSLOduration=124.167608841 podStartE2EDuration="2m4.167608841s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:39.366368172 +0000 UTC m=+147.144578110" watchObservedRunningTime="2026-01-27 15:49:40.167608841 +0000 UTC m=+147.945818779" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 
15:49:40.217602 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" podStartSLOduration=124.217573212 podStartE2EDuration="2m4.217573212s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:40.173274903 +0000 UTC m=+147.951484851" watchObservedRunningTime="2026-01-27 15:49:40.217573212 +0000 UTC m=+147.995783150" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.264010 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.264540 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.764522716 +0000 UTC m=+148.542732654 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.367696 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.368107 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.868086271 +0000 UTC m=+148.646296209 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.469994 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.470676 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:40.970647327 +0000 UTC m=+148.748857435 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.571837 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.572147 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.072094991 +0000 UTC m=+148.850304949 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.572647 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.573040 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.073023478 +0000 UTC m=+148.851233416 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.675575 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.675946 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.175927494 +0000 UTC m=+148.954137432 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.781212 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.781688 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.281670322 +0000 UTC m=+149.059880260 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.884882 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.885551 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.385532574 +0000 UTC m=+149.163742512 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.962993 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:49:40 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Jan 27 15:49:40 crc kubenswrapper[4713]: [+]process-running ok Jan 27 15:49:40 crc kubenswrapper[4713]: healthz check failed Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.963073 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:40 crc kubenswrapper[4713]: I0127 15:49:40.987664 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:40 crc kubenswrapper[4713]: E0127 15:49:40.988072 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:41.488057829 +0000 UTC m=+149.266267767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.088718 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.089549 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.589526614 +0000 UTC m=+149.367736552 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: W0127 15:49:41.138867 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-0bfbe6cfd3124856109ae698c857ffd4b2c0610848feeb5a8aa6046426d4751f WatchSource:0}: Error finding container 0bfbe6cfd3124856109ae698c857ffd4b2c0610848feeb5a8aa6046426d4751f: Status 404 returned error can't find the container with id 0bfbe6cfd3124856109ae698c857ffd4b2c0610848feeb5a8aa6046426d4751f
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.154663 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-sndzr"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.156096 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.184762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.191141 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-catalog-content\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.191210 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.191286 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gw2qx\" (UniqueName: \"kubernetes.io/projected/7fd376e7-8ed4-449c-941d-2276a631f20b-kube-api-access-gw2qx\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.191318 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-utilities\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.194953 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.694937652 +0000 UTC m=+149.473147590 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.295570 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.295867 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gw2qx\" (UniqueName: \"kubernetes.io/projected/7fd376e7-8ed4-449c-941d-2276a631f20b-kube-api-access-gw2qx\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.295899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-utilities\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.295971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-catalog-content\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.296869 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-catalog-content\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.296958 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.796937573 +0000 UTC m=+149.575147501 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.297717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-utilities\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.347555 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mc2x2"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.349144 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.368743 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sndzr"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.399752 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-catalog-content\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.399838 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-utilities\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.399870 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mc2x2"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.399897 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.400248 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85t4t\" (UniqueName: \"kubernetes.io/projected/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-kube-api-access-85t4t\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.401877 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:41.901815455 +0000 UTC m=+149.680025393 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.418966 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.433252 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gw2qx\" (UniqueName: \"kubernetes.io/projected/7fd376e7-8ed4-449c-941d-2276a631f20b-kube-api-access-gw2qx\") pod \"community-operators-sndzr\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") " pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.439362 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-5cl72"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.498668 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-5cl72"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.501146 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.501337 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.001296723 +0000 UTC m=+149.779506661 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.501567 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-utilities\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.501625 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.501715 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85t4t\" (UniqueName: \"kubernetes.io/projected/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-kube-api-access-85t4t\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.501778 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-catalog-content\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.503577 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-utilities\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.504330 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.00431076 +0000 UTC m=+149.782520698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.504766 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-catalog-content\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.551575 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.559834 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-b2xs9"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.581324 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.610427 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.610798 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.110759807 +0000 UTC m=+149.888969745 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.610876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.610936 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crl54\" (UniqueName: \"kubernetes.io/projected/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-kube-api-access-crl54\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.610998 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-utilities\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.611017 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-catalog-content\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.612447 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.112426275 +0000 UTC m=+149.890636213 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.637111 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85t4t\" (UniqueName: \"kubernetes.io/projected/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-kube-api-access-85t4t\") pod \"certified-operators-mc2x2\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.655187 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2xs9"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.671533 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-55xhg"]
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.672762 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.714724 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.714967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-catalog-content\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.715008 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crl54\" (UniqueName: \"kubernetes.io/projected/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-kube-api-access-crl54\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.715056 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v95tg\" (UniqueName: \"kubernetes.io/projected/7abaac35-62ca-4792-89e7-c9f32a551079-kube-api-access-v95tg\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.715108 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-utilities\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.715133 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-catalog-content\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.715152 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-utilities\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.715381 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.215352932 +0000 UTC m=+149.993562870 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.716318 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-utilities\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.716585 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-catalog-content\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.757552 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mc2x2"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.784308 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crl54\" (UniqueName: \"kubernetes.io/projected/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-kube-api-access-crl54\") pod \"community-operators-b2xs9\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.817276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.817395 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-catalog-content\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.817449 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v95tg\" (UniqueName: \"kubernetes.io/projected/7abaac35-62ca-4792-89e7-c9f32a551079-kube-api-access-v95tg\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.817500 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-utilities\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.819684 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-utilities\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.820133 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.320116081 +0000 UTC m=+150.098326019 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.820435 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-catalog-content\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.920060 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:41 crc kubenswrapper[4713]: E0127 15:49:41.920821 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.420802794 +0000 UTC m=+150.199012732 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:41 crc kubenswrapper[4713]: I0127 15:49:41.923345 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55xhg"]
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:41.987380 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Jan 27 15:49:42 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld
Jan 27 15:49:42 crc kubenswrapper[4713]: [+]process-running ok
Jan 27 15:49:42 crc kubenswrapper[4713]: healthz check failed
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:41.987458 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:41.988399 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:41.998834 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v95tg\" (UniqueName: \"kubernetes.io/projected/7abaac35-62ca-4792-89e7-c9f32a551079-kube-api-access-v95tg\") pod \"certified-operators-55xhg\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") " pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.022274 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.022695 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.52268074 +0000 UTC m=+150.300890678 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.057712 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.125847 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.126436 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.62641617 +0000 UTC m=+150.404626108 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.153434 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"f631c0d49d887c7ca04acac7c9b4d7d0539417f3d62055aa183496a7641408e3"}
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.208080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c451b9d286dd1fa687ee47577052204d65b39e5c84284f080111def2e73a838a"}
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.208137 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"0bfbe6cfd3124856109ae698c857ffd4b2c0610848feeb5a8aa6046426d4751f"}
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.231242 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22"
Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.231755 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.731735196 +0000 UTC m=+150.509945134 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.278453 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"4e07878140f2d68156ff281d181722bf0f8a6ac104a95908b422072ff2662705"}
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.348811 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.349669 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.359192 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.359594 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.859567585 +0000 UTC m=+150.637777523 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.359650 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.359809 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName:
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.359974 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.392908 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.397449 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.897424249 +0000 UTC m=+150.675634187 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.398534 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.454642 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-sndzr"] Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.471860 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.493134 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.493802 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.493872 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.494003 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.494134 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:42.994112317 +0000 UTC m=+150.772322255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:42 crc kubenswrapper[4713]: W0127 15:49:42.559992 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd376e7_8ed4_449c_941d_2276a631f20b.slice/crio-cb585da552a5d177b38d09168943c0267e9a47faddc3b568c60932933055bcd8 WatchSource:0}: Error finding container cb585da552a5d177b38d09168943c0267e9a47faddc3b568c60932933055bcd8: Status 404 returned error can't find the container with id cb585da552a5d177b38d09168943c0267e9a47faddc3b568c60932933055bcd8 Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.560670 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.560711 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.586090 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kube-api-access\") pod 
\"revision-pruner-9-crc\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.597337 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.597736 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.097721304 +0000 UTC m=+150.875931242 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.698407 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.698971 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.198931061 +0000 UTC m=+150.977140999 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.722976 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.736860 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-72qz9" Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.751674 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mc2x2"] Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.801296 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.815225 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: 
nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.31519482 +0000 UTC m=+151.093404758 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.902611 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:42 crc kubenswrapper[4713]: E0127 15:49:42.903134 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.403103727 +0000 UTC m=+151.181313665 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.978492 4713 patch_prober.go:28] interesting pod/router-default-5444994796-xdbd4 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 27 15:49:42 crc kubenswrapper[4713]: [-]has-synced failed: reason withheld Jan 27 15:49:42 crc kubenswrapper[4713]: [+]process-running ok Jan 27 15:49:42 crc kubenswrapper[4713]: healthz check failed Jan 27 15:49:42 crc kubenswrapper[4713]: I0127 15:49:42.978575 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-xdbd4" podUID="49a98c2a-bf95-449c-8fee-d2b7ed4381db" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.004773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.011387 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:43.511371266 +0000 UTC m=+151.289581204 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.107090 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.107450 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.607432457 +0000 UTC m=+151.385642395 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.210244 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.210899 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.710883318 +0000 UTC m=+151.489093256 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.259295 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.259366 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.259302 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.259451 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.310489 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"b5b4696d93b70b8df31738da258f9a594f7380915744aab7673a88dae689b1e3"} Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.311373 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.313823 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.314435 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.814417103 +0000 UTC m=+151.592627041 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.336904 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sndzr" event={"ID":"7fd376e7-8ed4-449c-941d-2276a631f20b","Type":"ContainerStarted","Data":"7530abd5c6392246e837c132a1687109ffe40beea3b38d05d4355a1e1d84ca07"} Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.336978 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sndzr" event={"ID":"7fd376e7-8ed4-449c-941d-2276a631f20b","Type":"ContainerStarted","Data":"cb585da552a5d177b38d09168943c0267e9a47faddc3b568c60932933055bcd8"} Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.359452 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" event={"ID":"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a","Type":"ContainerStarted","Data":"fec2808c9a1f4591c4e0d40ece0e5fe0a07d78ec85b32370e3af8a1ca52451f5"} Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.369994 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mc2x2" event={"ID":"7f6159db-0f3d-486f-9e71-fda6a1c18f1f","Type":"ContainerStarted","Data":"9dfe35d070a78ced67f6726cee271369c3119b56cde83aaf4ac8a13153c98ad7"} Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.378396 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"b400e6ab832401cc29bfe62817c45fcd9d1aa648010410454f7539330a24df87"} Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.402623 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sjnst"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.409483 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.416248 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.417839 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:43.917818603 +0000 UTC m=+151.696028541 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.439981 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.503967 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjnst"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.518995 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.519317 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-utilities\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.519352 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml77s\" (UniqueName: \"kubernetes.io/projected/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-kube-api-access-ml77s\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " 
pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.519415 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-catalog-content\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.519573 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.019556196 +0000 UTC m=+151.797766134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.541302 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-b2xs9"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.541940 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-55xhg"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.621681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-catalog-content\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") 
" pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.621784 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.621823 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-utilities\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.621853 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml77s\" (UniqueName: \"kubernetes.io/projected/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-kube-api-access-ml77s\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.622929 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-catalog-content\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.623457 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-27 15:49:44.12343957 +0000 UTC m=+151.901649508 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.623782 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-utilities\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.680634 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.725084 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.726323 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.226295545 +0000 UTC m=+152.004505493 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.812995 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml77s\" (UniqueName: \"kubernetes.io/projected/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-kube-api-access-ml77s\") pod \"redhat-marketplace-sjnst\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") " pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.813902 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.813972 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.814632 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.815649 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dkjdv"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.826969 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.829724 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.329701495 +0000 UTC m=+152.107911433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.836424 4713 patch_prober.go:28] interesting pod/console-f9d7485db-xffcl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" start-of-body= Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.836517 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-xffcl" podUID="a38fa71e-fb6f-4742-9291-0038d946cfec" containerName="console" probeResult="failure" output="Get \"https://10.217.0.14:8443/health\": dial tcp 10.217.0.14:8443: connect: connection refused" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.848374 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-tfz52" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.848428 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkjdv"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.848760 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.877486 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-qbcpm" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.928199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:43 crc kubenswrapper[4713]: E0127 15:49:43.930029 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.430007117 +0000 UTC m=+152.208217055 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.959123 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.971860 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.980134 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-vkk8k" Jan 27 15:49:43 crc kubenswrapper[4713]: I0127 15:49:43.984155 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.037447 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-utilities\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.037513 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-catalog-content\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " 
pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.037547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.037648 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rfbbm\" (UniqueName: \"kubernetes.io/projected/c8aca94a-14f9-473c-b913-be4fececbb18-kube-api-access-rfbbm\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: E0127 15:49:44.039472 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.539451361 +0000 UTC m=+152.317661309 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.141849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.142102 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-utilities\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.142145 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-catalog-content\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.142213 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rfbbm\" (UniqueName: \"kubernetes.io/projected/c8aca94a-14f9-473c-b913-be4fececbb18-kube-api-access-rfbbm\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " 
pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: E0127 15:49:44.143119 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.643097988 +0000 UTC m=+152.421307916 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.143499 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-utilities\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.164370 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-catalog-content\") pod \"redhat-marketplace-dkjdv\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.193321 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rfbbm\" (UniqueName: \"kubernetes.io/projected/c8aca94a-14f9-473c-b913-be4fececbb18-kube-api-access-rfbbm\") pod \"redhat-marketplace-dkjdv\" (UID: 
\"c8aca94a-14f9-473c-b913-be4fececbb18\") " pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.247207 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:44 crc kubenswrapper[4713]: E0127 15:49:44.247609 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.7475943 +0000 UTC m=+152.525804238 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.258797 4713 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.266220 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.316579 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-r4v5v"] Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.350027 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r4v5v"] Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.371316 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.371929 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.372302 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-utilities\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.372414 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvc6g\" (UniqueName: \"kubernetes.io/projected/8597977e-b4b9-4ad9-98ac-61187df61af5-kube-api-access-jvc6g\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: E0127 15:49:44.372515 4713 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.872485285 +0000 UTC m=+152.650695223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.372597 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-catalog-content\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.373802 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.411542 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mc2x2" event={"ID":"7f6159db-0f3d-486f-9e71-fda6a1c18f1f","Type":"ContainerDied","Data":"8285266abe8d1a9e06bf9f6164e24c07921352dedbf8d4d909b6322a92aa2d57"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.408254 4713 generic.go:334] "Generic (PLEG): container finished" podID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerID="8285266abe8d1a9e06bf9f6164e24c07921352dedbf8d4d909b6322a92aa2d57" exitCode=0 Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.450313 4713 provider.go:102] 
Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.465947 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-mcjdx"] Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.472475 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f3a3e9e-51be-4ba2-88eb-8bef91440a06","Type":"ContainerStarted","Data":"4e28759afd6050593e49e4ed2e33faea50fee77728c76b5de57d37c59dedd744"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.472705 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.474655 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerStarted","Data":"549d55986d4e2662fad556087fa718e9cb476eaff423fa6de381cf6ae4de5e60"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.474684 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerStarted","Data":"429fad458e6e534a2ac3ae5c1a60ed0fed6285cc4084d44475eafbd8a0d7ba82"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.474976 4713 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-27T15:49:44.258820041Z","Handler":null,"Name":""} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.484513 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.484552 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488335 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-utilities\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488429 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68fsg\" (UniqueName: \"kubernetes.io/projected/a2ab03d0-a8a9-4588-91e6-48d50999c219-kube-api-access-68fsg\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488499 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvc6g\" (UniqueName: \"kubernetes.io/projected/8597977e-b4b9-4ad9-98ac-61187df61af5-kube-api-access-jvc6g\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488614 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-catalog-content\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488670 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-utilities\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.488692 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-catalog-content\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: E0127 15:49:44.490217 4713 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-27 15:49:44.990201384 +0000 UTC m=+152.768411322 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-jjr22" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.491478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-utilities\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.506426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-catalog-content\") pod \"redhat-operators-r4v5v\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.511837 4713 generic.go:334] "Generic (PLEG): container finished" podID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerID="7530abd5c6392246e837c132a1687109ffe40beea3b38d05d4355a1e1d84ca07" exitCode=0 Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.512275 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sndzr" event={"ID":"7fd376e7-8ed4-449c-941d-2276a631f20b","Type":"ContainerDied","Data":"7530abd5c6392246e837c132a1687109ffe40beea3b38d05d4355a1e1d84ca07"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.521544 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcjdx"] Jan 27 15:49:44 crc kubenswrapper[4713]: 
I0127 15:49:44.524955 4713 generic.go:334] "Generic (PLEG): container finished" podID="7abaac35-62ca-4792-89e7-c9f32a551079" containerID="72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a" exitCode=0 Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.525095 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55xhg" event={"ID":"7abaac35-62ca-4792-89e7-c9f32a551079","Type":"ContainerDied","Data":"72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.525137 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55xhg" event={"ID":"7abaac35-62ca-4792-89e7-c9f32a551079","Type":"ContainerStarted","Data":"8517fba529d0830971338f6d426a0d60a83bd75a65e66956d76727649b3f5469"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.535686 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.562337 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" event={"ID":"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a","Type":"ContainerStarted","Data":"5fdc13727801eb3184c9ffb1f9c6d281e2120334920f238de4b564d4ea39f7ac"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.562400 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" event={"ID":"adcd14ab-0774-4aaa-9e30-7b861f8c5c1a","Type":"ContainerStarted","Data":"b04efdaf62f812c0170af223bfb1d79e3290a25befa4904b70c77dc5584a20c5"} Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.562426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvc6g\" (UniqueName: \"kubernetes.io/projected/8597977e-b4b9-4ad9-98ac-61187df61af5-kube-api-access-jvc6g\") pod \"redhat-operators-r4v5v\" (UID: 
\"8597977e-b4b9-4ad9-98ac-61187df61af5\") " pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.573214 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjnst"] Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.582414 4713 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.582512 4713 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.591636 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-xdbd4" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.591947 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.592965 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-utilities\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.593154 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-catalog-content\") pod 
\"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.593219 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68fsg\" (UniqueName: \"kubernetes.io/projected/a2ab03d0-a8a9-4588-91e6-48d50999c219-kube-api-access-68fsg\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.593934 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-catalog-content\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.594104 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-utilities\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.606355 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.643568 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68fsg\" (UniqueName: \"kubernetes.io/projected/a2ab03d0-a8a9-4588-91e6-48d50999c219-kube-api-access-68fsg\") pod \"redhat-operators-mcjdx\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") " pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.696674 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.720002 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-vdwzh" podStartSLOduration=13.719979673 podStartE2EDuration="13.719979673s" podCreationTimestamp="2026-01-27 15:49:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:44.717143732 +0000 UTC m=+152.495353690" watchObservedRunningTime="2026-01-27 15:49:44.719979673 +0000 UTC m=+152.498189611" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.734127 4713 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.734593 4713 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.803214 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-jjr22\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.807344 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.854127 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-mcjdx" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.914747 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.915958 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkjdv"] Jan 27 15:49:44 crc kubenswrapper[4713]: W0127 15:49:44.939694 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8aca94a_14f9_473c_b913_be4fececbb18.slice/crio-471589bee8e0b5212c9057cba07a05bc7a8d7d019e07a03d56f7c812a388fb02 WatchSource:0}: Error finding container 471589bee8e0b5212c9057cba07a05bc7a8d7d019e07a03d56f7c812a388fb02: Status 404 returned error can't find the container with id 471589bee8e0b5212c9057cba07a05bc7a8d7d019e07a03d56f7c812a388fb02 Jan 27 15:49:44 crc kubenswrapper[4713]: I0127 15:49:44.950075 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.197621 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r4v5v"] Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.245265 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-mcjdx"] Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.307139 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jjr22"] Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.569672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4v5v" event={"ID":"8597977e-b4b9-4ad9-98ac-61187df61af5","Type":"ContainerStarted","Data":"71e982eb5fec648320d3d344cb144d10b17ccb4115a727f109cb857ca89022b8"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.571178 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f3a3e9e-51be-4ba2-88eb-8bef91440a06","Type":"ContainerStarted","Data":"5a3471c0a5e5a1d6d64fe265f6209b66d1489b3fe00aee44bd85cf80a7ff1d56"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.574751 4713 generic.go:334] "Generic (PLEG): container finished" podID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerID="549d55986d4e2662fad556087fa718e9cb476eaff423fa6de381cf6ae4de5e60" exitCode=0 Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.574811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerDied","Data":"549d55986d4e2662fad556087fa718e9cb476eaff423fa6de381cf6ae4de5e60"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.582210 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerStarted","Data":"9b0a6e8b24d95990ca45ee8194cc597ac4ad7344854c7a70d5d9bf7003a3d1a8"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.584287 4713 generic.go:334] "Generic (PLEG): container finished" podID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerID="67ae5dbfee5976259542752aa0fb489e0dd126a8db80fb1657b1bb620d3cc592" exitCode=0 Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.584373 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjnst" event={"ID":"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1","Type":"ContainerDied","Data":"67ae5dbfee5976259542752aa0fb489e0dd126a8db80fb1657b1bb620d3cc592"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.584422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjnst" event={"ID":"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1","Type":"ContainerStarted","Data":"d0c6e8c3fa93fde547baaec8f7d219d897fcc3f66f34309def0ea40be0f167cd"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.585525 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" event={"ID":"4994bfc4-7e92-4877-a981-6d94b4df000a","Type":"ContainerStarted","Data":"daa8ff730b22b66d1845c5aa1735bbe07dc8f2902cb530c9c8311963cd33d80e"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.588848 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8aca94a-14f9-473c-b913-be4fececbb18" containerID="de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08" exitCode=0 Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.588978 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkjdv" 
event={"ID":"c8aca94a-14f9-473c-b913-be4fececbb18","Type":"ContainerDied","Data":"de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.589058 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkjdv" event={"ID":"c8aca94a-14f9-473c-b913-be4fececbb18","Type":"ContainerStarted","Data":"471589bee8e0b5212c9057cba07a05bc7a8d7d019e07a03d56f7c812a388fb02"} Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.600958 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.600937115 podStartE2EDuration="3.600937115s" podCreationTimestamp="2026-01-27 15:49:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:45.596537429 +0000 UTC m=+153.374747367" watchObservedRunningTime="2026-01-27 15:49:45.600937115 +0000 UTC m=+153.379147053" Jan 27 15:49:45 crc kubenswrapper[4713]: I0127 15:49:45.613412 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-j9gm5" Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.608919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerStarted","Data":"86090809c9f42092fbf6d9e334a8bd97f1addfc76efaababe6e18e3156fa6869"} Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.618391 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" event={"ID":"4994bfc4-7e92-4877-a981-6d94b4df000a","Type":"ContainerStarted","Data":"a22ace44e1f5d186c5f05236660b55dfa2b296cc07956272ae6d73c498806db6"} Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.618739 4713 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.621004 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4v5v" event={"ID":"8597977e-b4b9-4ad9-98ac-61187df61af5","Type":"ContainerStarted","Data":"70d466e7b15e205c50d6fc776d11e5c27f023b259253de035389bcf5cdba21d5"} Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.625878 4713 generic.go:334] "Generic (PLEG): container finished" podID="5f3a3e9e-51be-4ba2-88eb-8bef91440a06" containerID="5a3471c0a5e5a1d6d64fe265f6209b66d1489b3fe00aee44bd85cf80a7ff1d56" exitCode=0 Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.625961 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f3a3e9e-51be-4ba2-88eb-8bef91440a06","Type":"ContainerDied","Data":"5a3471c0a5e5a1d6d64fe265f6209b66d1489b3fe00aee44bd85cf80a7ff1d56"} Jan 27 15:49:46 crc kubenswrapper[4713]: I0127 15:49:46.674894 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" podStartSLOduration=130.674871891 podStartE2EDuration="2m10.674871891s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:46.670781084 +0000 UTC m=+154.448991022" watchObservedRunningTime="2026-01-27 15:49:46.674871891 +0000 UTC m=+154.453081829" Jan 27 15:49:47 crc kubenswrapper[4713]: I0127 15:49:47.646705 4713 generic.go:334] "Generic (PLEG): container finished" podID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerID="70d466e7b15e205c50d6fc776d11e5c27f023b259253de035389bcf5cdba21d5" exitCode=0 Jan 27 15:49:47 crc kubenswrapper[4713]: I0127 15:49:47.646816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-r4v5v" event={"ID":"8597977e-b4b9-4ad9-98ac-61187df61af5","Type":"ContainerDied","Data":"70d466e7b15e205c50d6fc776d11e5c27f023b259253de035389bcf5cdba21d5"} Jan 27 15:49:47 crc kubenswrapper[4713]: I0127 15:49:47.671595 4713 generic.go:334] "Generic (PLEG): container finished" podID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerID="86090809c9f42092fbf6d9e334a8bd97f1addfc76efaababe6e18e3156fa6869" exitCode=0 Jan 27 15:49:47 crc kubenswrapper[4713]: I0127 15:49:47.671682 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerDied","Data":"86090809c9f42092fbf6d9e334a8bd97f1addfc76efaababe6e18e3156fa6869"} Jan 27 15:49:47 crc kubenswrapper[4713]: I0127 15:49:47.680114 4713 generic.go:334] "Generic (PLEG): container finished" podID="86cad552-e907-4741-9709-e3952fcf470a" containerID="13b2107423485d22474ff7743ac408e9012f1985ca272a898da0fc322472eee6" exitCode=0 Jan 27 15:49:47 crc kubenswrapper[4713]: I0127 15:49:47.680459 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" event={"ID":"86cad552-e907-4741-9709-e3952fcf470a","Type":"ContainerDied","Data":"13b2107423485d22474ff7743ac408e9012f1985ca272a898da0fc322472eee6"} Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.285148 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.384383 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kubelet-dir\") pod \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.384845 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kube-api-access\") pod \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\" (UID: \"5f3a3e9e-51be-4ba2-88eb-8bef91440a06\") " Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.384555 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5f3a3e9e-51be-4ba2-88eb-8bef91440a06" (UID: "5f3a3e9e-51be-4ba2-88eb-8bef91440a06"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.385541 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.408658 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5f3a3e9e-51be-4ba2-88eb-8bef91440a06" (UID: "5f3a3e9e-51be-4ba2-88eb-8bef91440a06"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.487264 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5f3a3e9e-51be-4ba2-88eb-8bef91440a06-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.591356 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 15:49:48 crc kubenswrapper[4713]: E0127 15:49:48.592333 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f3a3e9e-51be-4ba2-88eb-8bef91440a06" containerName="pruner" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.592357 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f3a3e9e-51be-4ba2-88eb-8bef91440a06" containerName="pruner" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.592514 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f3a3e9e-51be-4ba2-88eb-8bef91440a06" containerName="pruner" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.593122 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.601597 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.601797 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.623398 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.694916 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.695026 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.780801 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.784490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"5f3a3e9e-51be-4ba2-88eb-8bef91440a06","Type":"ContainerDied","Data":"4e28759afd6050593e49e4ed2e33faea50fee77728c76b5de57d37c59dedd744"} Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.784676 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e28759afd6050593e49e4ed2e33faea50fee77728c76b5de57d37c59dedd744" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.796531 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.796620 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.797223 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.839401 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:48 crc kubenswrapper[4713]: I0127 15:49:48.936701 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.444351 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-vmvmj" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.724815 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.757746 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.829369 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxq4s\" (UniqueName: \"kubernetes.io/projected/86cad552-e907-4741-9709-e3952fcf470a-kube-api-access-xxq4s\") pod \"86cad552-e907-4741-9709-e3952fcf470a\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.829511 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume\") pod \"86cad552-e907-4741-9709-e3952fcf470a\" (UID: \"86cad552-e907-4741-9709-e3952fcf470a\") " Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.829579 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86cad552-e907-4741-9709-e3952fcf470a-secret-volume\") pod \"86cad552-e907-4741-9709-e3952fcf470a\" (UID: 
\"86cad552-e907-4741-9709-e3952fcf470a\") " Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.836016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume" (OuterVolumeSpecName: "config-volume") pod "86cad552-e907-4741-9709-e3952fcf470a" (UID: "86cad552-e907-4741-9709-e3952fcf470a"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.844481 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" event={"ID":"86cad552-e907-4741-9709-e3952fcf470a","Type":"ContainerDied","Data":"b10927a4c65e1528b280042005a4b3ece118ffffd2edab9d21688b7748df14d3"} Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.844522 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b10927a4c65e1528b280042005a4b3ece118ffffd2edab9d21688b7748df14d3" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.844594 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492145-r8sxx" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.849492 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86cad552-e907-4741-9709-e3952fcf470a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "86cad552-e907-4741-9709-e3952fcf470a" (UID: "86cad552-e907-4741-9709-e3952fcf470a"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.849549 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86cad552-e907-4741-9709-e3952fcf470a-kube-api-access-xxq4s" (OuterVolumeSpecName: "kube-api-access-xxq4s") pod "86cad552-e907-4741-9709-e3952fcf470a" (UID: "86cad552-e907-4741-9709-e3952fcf470a"). InnerVolumeSpecName "kube-api-access-xxq4s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.932091 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxq4s\" (UniqueName: \"kubernetes.io/projected/86cad552-e907-4741-9709-e3952fcf470a-kube-api-access-xxq4s\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.932153 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86cad552-e907-4741-9709-e3952fcf470a-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:49 crc kubenswrapper[4713]: I0127 15:49:49.932167 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/86cad552-e907-4741-9709-e3952fcf470a-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:50 crc kubenswrapper[4713]: I0127 15:49:50.871143 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5bc4eee5-2e04-4225-9cb0-d944bfa66a04","Type":"ContainerStarted","Data":"9f3e9f05e454e5646e7a3ce94da82d50ecc96b386eae5b4ee55e095fe6471432"} Jan 27 15:49:51 crc kubenswrapper[4713]: I0127 15:49:51.907791 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5bc4eee5-2e04-4225-9cb0-d944bfa66a04","Type":"ContainerStarted","Data":"7cbb95e8c56ee63960cc3b8444d5517411d8e274c3a7ef2d0424fab4f7a27fa4"} Jan 27 15:49:51 crc 
kubenswrapper[4713]: I0127 15:49:51.926932 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.9269018239999998 podStartE2EDuration="3.926901824s" podCreationTimestamp="2026-01-27 15:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:49:51.924906027 +0000 UTC m=+159.703115985" watchObservedRunningTime="2026-01-27 15:49:51.926901824 +0000 UTC m=+159.705111762" Jan 27 15:49:52 crc kubenswrapper[4713]: I0127 15:49:52.925379 4713 generic.go:334] "Generic (PLEG): container finished" podID="5bc4eee5-2e04-4225-9cb0-d944bfa66a04" containerID="7cbb95e8c56ee63960cc3b8444d5517411d8e274c3a7ef2d0424fab4f7a27fa4" exitCode=0 Jan 27 15:49:52 crc kubenswrapper[4713]: I0127 15:49:52.940890 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5bc4eee5-2e04-4225-9cb0-d944bfa66a04","Type":"ContainerDied","Data":"7cbb95e8c56ee63960cc3b8444d5517411d8e274c3a7ef2d0424fab4f7a27fa4"} Jan 27 15:49:53 crc kubenswrapper[4713]: I0127 15:49:53.260145 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:49:53 crc kubenswrapper[4713]: I0127 15:49:53.260308 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:49:53 crc kubenswrapper[4713]: I0127 15:49:53.260315 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server 
namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:49:53 crc kubenswrapper[4713]: I0127 15:49:53.260371 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:49:53 crc kubenswrapper[4713]: I0127 15:49:53.889567 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:53 crc kubenswrapper[4713]: I0127 15:49:53.893988 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-xffcl" Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.564868 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.731737 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kubelet-dir\") pod \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.731862 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5bc4eee5-2e04-4225-9cb0-d944bfa66a04" (UID: "5bc4eee5-2e04-4225-9cb0-d944bfa66a04"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.731882 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kube-api-access\") pod \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\" (UID: \"5bc4eee5-2e04-4225-9cb0-d944bfa66a04\") " Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.732611 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.744141 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5bc4eee5-2e04-4225-9cb0-d944bfa66a04" (UID: "5bc4eee5-2e04-4225-9cb0-d944bfa66a04"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:49:54 crc kubenswrapper[4713]: I0127 15:49:54.833618 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5bc4eee5-2e04-4225-9cb0-d944bfa66a04-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:49:55 crc kubenswrapper[4713]: I0127 15:49:55.094746 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"5bc4eee5-2e04-4225-9cb0-d944bfa66a04","Type":"ContainerDied","Data":"9f3e9f05e454e5646e7a3ce94da82d50ecc96b386eae5b4ee55e095fe6471432"} Jan 27 15:49:55 crc kubenswrapper[4713]: I0127 15:49:55.094864 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9f3e9f05e454e5646e7a3ce94da82d50ecc96b386eae5b4ee55e095fe6471432" Jan 27 15:49:55 crc kubenswrapper[4713]: I0127 15:49:55.094885 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 27 15:49:56 crc kubenswrapper[4713]: I0127 15:49:56.918778 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:49:58 crc kubenswrapper[4713]: I0127 15:49:58.409391 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:58 crc kubenswrapper[4713]: I0127 15:49:58.416704 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81e4c47d-af8d-44f0-beff-17cf5f133ff7-metrics-certs\") pod \"network-metrics-daemon-mdw5k\" (UID: \"81e4c47d-af8d-44f0-beff-17cf5f133ff7\") " 
pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:49:58 crc kubenswrapper[4713]: I0127 15:49:58.618784 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-mdw5k" Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.259087 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.259472 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.259513 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-mkwg6" Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.259087 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.259552 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.259985 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" 
containerStatusID={"Type":"cri-o","ID":"7e6d1e97c9d4e9de2badf45747b38b582b7f80e94a2f11900b11d6bade108be5"} pod="openshift-console/downloads-7954f5f757-mkwg6" containerMessage="Container download-server failed liveness probe, will be restarted" Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.260187 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.260200 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" containerID="cri-o://7e6d1e97c9d4e9de2badf45747b38b582b7f80e94a2f11900b11d6bade108be5" gracePeriod=2 Jan 27 15:50:03 crc kubenswrapper[4713]: I0127 15:50:03.260210 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:04 crc kubenswrapper[4713]: I0127 15:50:04.955944 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:50:10 crc kubenswrapper[4713]: I0127 15:50:10.275085 4713 generic.go:334] "Generic (PLEG): container finished" podID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerID="7e6d1e97c9d4e9de2badf45747b38b582b7f80e94a2f11900b11d6bade108be5" exitCode=0 Jan 27 15:50:10 crc kubenswrapper[4713]: I0127 15:50:10.275135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mkwg6" 
event={"ID":"01f5e6a0-654f-41cb-a694-2c86ca8523d8","Type":"ContainerDied","Data":"7e6d1e97c9d4e9de2badf45747b38b582b7f80e94a2f11900b11d6bade108be5"} Jan 27 15:50:12 crc kubenswrapper[4713]: I0127 15:50:12.555320 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:50:12 crc kubenswrapper[4713]: I0127 15:50:12.555939 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:50:13 crc kubenswrapper[4713]: I0127 15:50:13.259290 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:13 crc kubenswrapper[4713]: I0127 15:50:13.259378 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:14 crc kubenswrapper[4713]: I0127 15:50:14.332011 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-zfnxn" Jan 27 15:50:20 crc kubenswrapper[4713]: I0127 15:50:20.248452 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 27 15:50:23 crc kubenswrapper[4713]: I0127 15:50:23.260728 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:23 crc kubenswrapper[4713]: I0127 15:50:23.260838 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.789819 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 15:50:24 crc kubenswrapper[4713]: E0127 15:50:24.790157 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bc4eee5-2e04-4225-9cb0-d944bfa66a04" containerName="pruner" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.790171 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bc4eee5-2e04-4225-9cb0-d944bfa66a04" containerName="pruner" Jan 27 15:50:24 crc kubenswrapper[4713]: E0127 15:50:24.790186 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86cad552-e907-4741-9709-e3952fcf470a" containerName="collect-profiles" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.790192 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="86cad552-e907-4741-9709-e3952fcf470a" containerName="collect-profiles" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.790313 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="86cad552-e907-4741-9709-e3952fcf470a" containerName="collect-profiles" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.790324 4713 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="5bc4eee5-2e04-4225-9cb0-d944bfa66a04" containerName="pruner" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.790791 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.792887 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.793960 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.794284 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.851570 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.851648 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.952938 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.953132 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:24 crc kubenswrapper[4713]: I0127 15:50:24.953182 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:25 crc kubenswrapper[4713]: I0127 15:50:24.991059 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:25 crc kubenswrapper[4713]: I0127 15:50:25.120872 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:29 crc kubenswrapper[4713]: I0127 15:50:29.981206 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 15:50:29 crc kubenswrapper[4713]: I0127 15:50:29.982899 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:29 crc kubenswrapper[4713]: I0127 15:50:29.996406 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.132600 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b10c0e8-a755-4134-a03c-4c84d3e05238-kube-api-access\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.132889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-var-lock\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.133023 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.234663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-var-lock\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.234744 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.234817 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b10c0e8-a755-4134-a03c-4c84d3e05238-kube-api-access\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.234884 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-var-lock\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.234929 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-kubelet-dir\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.260149 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b10c0e8-a755-4134-a03c-4c84d3e05238-kube-api-access\") pod \"installer-9-crc\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:30 crc kubenswrapper[4713]: I0127 15:50:30.330424 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:50:33 crc kubenswrapper[4713]: I0127 15:50:33.261188 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:33 crc kubenswrapper[4713]: I0127 15:50:33.262342 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:38 crc kubenswrapper[4713]: E0127 15:50:38.805722 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 15:50:38 crc kubenswrapper[4713]: E0127 15:50:38.806950 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-crl54,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-b2xs9_openshift-marketplace(9f1a4ce5-62ba-41c3-bad8-ee8a9054a145): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:38 crc kubenswrapper[4713]: E0127 15:50:38.808128 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-b2xs9" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" Jan 27 15:50:40 crc 
kubenswrapper[4713]: E0127 15:50:40.452917 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-b2xs9" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" Jan 27 15:50:40 crc kubenswrapper[4713]: E0127 15:50:40.531011 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 27 15:50:40 crc kubenswrapper[4713]: E0127 15:50:40.532151 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-85t4t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-mc2x2_openshift-marketplace(7f6159db-0f3d-486f-9e71-fda6a1c18f1f): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:40 crc kubenswrapper[4713]: E0127 15:50:40.533370 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-mc2x2" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" Jan 27 15:50:42 crc 
kubenswrapper[4713]: I0127 15:50:42.555974 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:50:42 crc kubenswrapper[4713]: I0127 15:50:42.557385 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:50:42 crc kubenswrapper[4713]: I0127 15:50:42.557467 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:50:42 crc kubenswrapper[4713]: I0127 15:50:42.558930 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c"} pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:50:42 crc kubenswrapper[4713]: I0127 15:50:42.559528 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" containerID="cri-o://6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c" gracePeriod=600 Jan 27 15:50:43 crc kubenswrapper[4713]: I0127 15:50:43.259609 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get 
\"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:43 crc kubenswrapper[4713]: I0127 15:50:43.260173 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:43 crc kubenswrapper[4713]: I0127 15:50:43.496990 4713 generic.go:334] "Generic (PLEG): container finished" podID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerID="6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c" exitCode=0 Jan 27 15:50:43 crc kubenswrapper[4713]: I0127 15:50:43.497088 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerDied","Data":"6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c"} Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.369627 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-mc2x2" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.489487 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.490076 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jvc6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-r4v5v_openshift-marketplace(8597977e-b4b9-4ad9-98ac-61187df61af5): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.493899 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = 
Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-r4v5v" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.533109 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.533436 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gw2qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},
TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-sndzr_openshift-marketplace(7fd376e7-8ed4-449c-941d-2276a631f20b): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.534636 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-sndzr" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.546564 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.546755 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-68fsg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-mcjdx_openshift-marketplace(a2ab03d0-a8a9-4588-91e6-48d50999c219): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:45 crc kubenswrapper[4713]: E0127 15:50:45.548018 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-mcjdx" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" Jan 27 15:50:46 crc 
kubenswrapper[4713]: E0127 15:50:46.637975 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-mcjdx" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.638299 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-r4v5v" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.639346 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-sndzr" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.797437 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.797882 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ml77s,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sjnst_openshift-marketplace(49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.799167 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sjnst" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" Jan 27 15:50:46 crc 
kubenswrapper[4713]: E0127 15:50:46.874584 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.874775 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rfbbm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-marketplace-dkjdv_openshift-marketplace(c8aca94a-14f9-473c-b913-be4fececbb18): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.876450 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-dkjdv" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" Jan 27 15:50:46 crc kubenswrapper[4713]: I0127 15:50:46.983283 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-mdw5k"] Jan 27 15:50:46 crc kubenswrapper[4713]: W0127 15:50:46.997228 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81e4c47d_af8d_44f0_beff_17cf5f133ff7.slice/crio-69e6e679ab1536a563f501fd382300fd3414ecc5cd58b5640d11b94378270d05 WatchSource:0}: Error finding container 69e6e679ab1536a563f501fd382300fd3414ecc5cd58b5640d11b94378270d05: Status 404 returned error can't find the container with id 69e6e679ab1536a563f501fd382300fd3414ecc5cd58b5640d11b94378270d05 Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.997953 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 27 15:50:46 crc kubenswrapper[4713]: E0127 15:50:46.998312 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v95tg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-55xhg_openshift-marketplace(7abaac35-62ca-4792-89e7-c9f32a551079): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:50:47 crc kubenswrapper[4713]: E0127 15:50:47.000225 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-55xhg" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" 
Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.029385 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.297721 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 27 15:50:47 crc kubenswrapper[4713]: W0127 15:50:47.307393 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod24b559b0_3afe_45d5_aa0a_23fa47ed20de.slice/crio-357df39e07277d86facabdd153598b0ac77732cec17dd4bf86fdd00ce788d08b WatchSource:0}: Error finding container 357df39e07277d86facabdd153598b0ac77732cec17dd4bf86fdd00ce788d08b: Status 404 returned error can't find the container with id 357df39e07277d86facabdd153598b0ac77732cec17dd4bf86fdd00ce788d08b Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.525366 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-mkwg6" event={"ID":"01f5e6a0-654f-41cb-a694-2c86ca8523d8","Type":"ContainerStarted","Data":"c8ef39b96f326589e733250d89c86bbfcc0f9138de6e2ee5f8b08aa57a2284f3"} Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.525680 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-mkwg6" Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.526279 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.526390 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 
10.217.0.8:8080: connect: connection refused" Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.527672 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"24b559b0-3afe-45d5-aa0a-23fa47ed20de","Type":"ContainerStarted","Data":"357df39e07277d86facabdd153598b0ac77732cec17dd4bf86fdd00ce788d08b"} Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.530816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" event={"ID":"81e4c47d-af8d-44f0-beff-17cf5f133ff7","Type":"ContainerStarted","Data":"731f856476dc252c0cccc12597e855099b8490b60a2aa6c76ca498b06ca2a5b3"} Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.530880 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" event={"ID":"81e4c47d-af8d-44f0-beff-17cf5f133ff7","Type":"ContainerStarted","Data":"69e6e679ab1536a563f501fd382300fd3414ecc5cd58b5640d11b94378270d05"} Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.536692 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"b4f76f2a78766e1dd9aa4ea8bd3b2724be1d207112925beb0d541bde19499bda"} Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.542222 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b10c0e8-a755-4134-a03c-4c84d3e05238","Type":"ContainerStarted","Data":"591e44fd633f88eb4daeeeaf0f44dd914dc7837dc204854267365805405cee98"} Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.542288 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b10c0e8-a755-4134-a03c-4c84d3e05238","Type":"ContainerStarted","Data":"a2d9483666d7fd5fd68c810817033eda831131995b0cf4826fb50a8096074140"} Jan 27 
15:50:47 crc kubenswrapper[4713]: E0127 15:50:47.544684 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sjnst" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" Jan 27 15:50:47 crc kubenswrapper[4713]: E0127 15:50:47.545546 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-dkjdv" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" Jan 27 15:50:47 crc kubenswrapper[4713]: E0127 15:50:47.547637 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-55xhg" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" Jan 27 15:50:47 crc kubenswrapper[4713]: I0127 15:50:47.615093 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=18.615068245 podStartE2EDuration="18.615068245s" podCreationTimestamp="2026-01-27 15:50:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:50:47.611237424 +0000 UTC m=+215.389447362" watchObservedRunningTime="2026-01-27 15:50:47.615068245 +0000 UTC m=+215.393278183" Jan 27 15:50:48 crc kubenswrapper[4713]: I0127 15:50:48.550029 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-mdw5k" 
event={"ID":"81e4c47d-af8d-44f0-beff-17cf5f133ff7","Type":"ContainerStarted","Data":"9d95c02dac1bd6a8d97f52197a214503225db6a698512a404788824bdb789b85"} Jan 27 15:50:48 crc kubenswrapper[4713]: I0127 15:50:48.551628 4713 generic.go:334] "Generic (PLEG): container finished" podID="24b559b0-3afe-45d5-aa0a-23fa47ed20de" containerID="e8f3a6fcd327e4d867ad3cb2d4cbf3ec76c9a7a8a0a996d920acefcd885b3845" exitCode=0 Jan 27 15:50:48 crc kubenswrapper[4713]: I0127 15:50:48.552363 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"24b559b0-3afe-45d5-aa0a-23fa47ed20de","Type":"ContainerDied","Data":"e8f3a6fcd327e4d867ad3cb2d4cbf3ec76c9a7a8a0a996d920acefcd885b3845"} Jan 27 15:50:48 crc kubenswrapper[4713]: I0127 15:50:48.553281 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:48 crc kubenswrapper[4713]: I0127 15:50:48.553361 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:48 crc kubenswrapper[4713]: I0127 15:50:48.576706 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-mdw5k" podStartSLOduration=192.576681257 podStartE2EDuration="3m12.576681257s" podCreationTimestamp="2026-01-27 15:47:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:50:48.57333534 +0000 UTC m=+216.351545288" watchObservedRunningTime="2026-01-27 15:50:48.576681257 +0000 UTC 
m=+216.354891195" Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.802723 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.860445 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kubelet-dir\") pod \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.860555 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kube-api-access\") pod \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\" (UID: \"24b559b0-3afe-45d5-aa0a-23fa47ed20de\") " Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.860585 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "24b559b0-3afe-45d5-aa0a-23fa47ed20de" (UID: "24b559b0-3afe-45d5-aa0a-23fa47ed20de"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.860875 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.869833 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "24b559b0-3afe-45d5-aa0a-23fa47ed20de" (UID: "24b559b0-3afe-45d5-aa0a-23fa47ed20de"). 
InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:50:49 crc kubenswrapper[4713]: I0127 15:50:49.962688 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/24b559b0-3afe-45d5-aa0a-23fa47ed20de-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:50:50 crc kubenswrapper[4713]: I0127 15:50:50.565568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"24b559b0-3afe-45d5-aa0a-23fa47ed20de","Type":"ContainerDied","Data":"357df39e07277d86facabdd153598b0ac77732cec17dd4bf86fdd00ce788d08b"} Jan 27 15:50:50 crc kubenswrapper[4713]: I0127 15:50:50.565628 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="357df39e07277d86facabdd153598b0ac77732cec17dd4bf86fdd00ce788d08b" Jan 27 15:50:50 crc kubenswrapper[4713]: I0127 15:50:50.565655 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 27 15:50:53 crc kubenswrapper[4713]: I0127 15:50:53.258831 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:53 crc kubenswrapper[4713]: I0127 15:50:53.259368 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:53 crc kubenswrapper[4713]: I0127 15:50:53.258977 4713 patch_prober.go:28] interesting pod/downloads-7954f5f757-mkwg6 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" start-of-body= Jan 27 15:50:53 crc kubenswrapper[4713]: I0127 15:50:53.259459 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-mkwg6" podUID="01f5e6a0-654f-41cb-a694-2c86ca8523d8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.8:8080/\": dial tcp 10.217.0.8:8080: connect: connection refused" Jan 27 15:50:57 crc kubenswrapper[4713]: I0127 15:50:57.610865 4713 generic.go:334] "Generic (PLEG): container finished" podID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerID="97576d58bfa555d0554f4f35c95923a066fe981d4c2bedf782a47b8195b4f448" exitCode=0 Jan 27 15:50:57 crc kubenswrapper[4713]: I0127 15:50:57.610968 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" 
event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerDied","Data":"97576d58bfa555d0554f4f35c95923a066fe981d4c2bedf782a47b8195b4f448"} Jan 27 15:50:58 crc kubenswrapper[4713]: I0127 15:50:58.624224 4713 generic.go:334] "Generic (PLEG): container finished" podID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerID="d9321a65dc2037e8f974766b27fe167aa1f0c3875dc70733805bba8f1646d744" exitCode=0 Jan 27 15:50:58 crc kubenswrapper[4713]: I0127 15:50:58.624305 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mc2x2" event={"ID":"7f6159db-0f3d-486f-9e71-fda6a1c18f1f","Type":"ContainerDied","Data":"d9321a65dc2037e8f974766b27fe167aa1f0c3875dc70733805bba8f1646d744"} Jan 27 15:50:58 crc kubenswrapper[4713]: I0127 15:50:58.631084 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerStarted","Data":"b4d85138dbae23d7b231a06088a7282b590d268eb95991b283035a6a6203334e"} Jan 27 15:50:58 crc kubenswrapper[4713]: I0127 15:50:58.669861 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-b2xs9" podStartSLOduration=4.132700431 podStartE2EDuration="1m17.669838016s" podCreationTimestamp="2026-01-27 15:49:41 +0000 UTC" firstStartedPulling="2026-01-27 15:49:44.496181806 +0000 UTC m=+152.274391744" lastFinishedPulling="2026-01-27 15:50:58.033319391 +0000 UTC m=+225.811529329" observedRunningTime="2026-01-27 15:50:58.668858808 +0000 UTC m=+226.447068776" watchObservedRunningTime="2026-01-27 15:50:58.669838016 +0000 UTC m=+226.448047954" Jan 27 15:50:59 crc kubenswrapper[4713]: I0127 15:50:59.639118 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" 
event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerStarted","Data":"19d02d35fa09b6f9d5ba45026905f4e019f315922c3837248aa73facd844bd41"} Jan 27 15:51:00 crc kubenswrapper[4713]: I0127 15:51:00.647158 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mc2x2" event={"ID":"7f6159db-0f3d-486f-9e71-fda6a1c18f1f","Type":"ContainerStarted","Data":"edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1"} Jan 27 15:51:00 crc kubenswrapper[4713]: I0127 15:51:00.649719 4713 generic.go:334] "Generic (PLEG): container finished" podID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerID="19d02d35fa09b6f9d5ba45026905f4e019f315922c3837248aa73facd844bd41" exitCode=0 Jan 27 15:51:00 crc kubenswrapper[4713]: I0127 15:51:00.649771 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerDied","Data":"19d02d35fa09b6f9d5ba45026905f4e019f315922c3837248aa73facd844bd41"} Jan 27 15:51:00 crc kubenswrapper[4713]: I0127 15:51:00.675900 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mc2x2" podStartSLOduration=4.795895412 podStartE2EDuration="1m19.675876897s" podCreationTimestamp="2026-01-27 15:49:41 +0000 UTC" firstStartedPulling="2026-01-27 15:49:44.448418058 +0000 UTC m=+152.226628006" lastFinishedPulling="2026-01-27 15:50:59.328399553 +0000 UTC m=+227.106609491" observedRunningTime="2026-01-27 15:51:00.67287359 +0000 UTC m=+228.451083548" watchObservedRunningTime="2026-01-27 15:51:00.675876897 +0000 UTC m=+228.454086855" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.480442 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5sdrm"] Jan 27 15:51:01 crc kubenswrapper[4713]: E0127 15:51:01.480981 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="24b559b0-3afe-45d5-aa0a-23fa47ed20de" containerName="pruner" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.480995 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="24b559b0-3afe-45d5-aa0a-23fa47ed20de" containerName="pruner" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.481129 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="24b559b0-3afe-45d5-aa0a-23fa47ed20de" containerName="pruner" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.481570 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.491956 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5sdrm"] Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.556495 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a567d575-448c-4b55-8661-30eb4efdf317-registry-certificates\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.556543 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a567d575-448c-4b55-8661-30eb4efdf317-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.556572 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a567d575-448c-4b55-8661-30eb4efdf317-trusted-ca\") pod 
\"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.556595 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a567d575-448c-4b55-8661-30eb4efdf317-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.556839 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-bound-sa-token\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.556956 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-registry-tls\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.557093 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.557188 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq926\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-kube-api-access-tq926\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.586845 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659705 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq926\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-kube-api-access-tq926\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659782 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a567d575-448c-4b55-8661-30eb4efdf317-registry-certificates\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659801 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a567d575-448c-4b55-8661-30eb4efdf317-ca-trust-extracted\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659841 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a567d575-448c-4b55-8661-30eb4efdf317-trusted-ca\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659867 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a567d575-448c-4b55-8661-30eb4efdf317-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659900 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-bound-sa-token\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.659932 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-registry-tls\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.670822 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a567d575-448c-4b55-8661-30eb4efdf317-ca-trust-extracted\") pod 
\"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.672830 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a567d575-448c-4b55-8661-30eb4efdf317-registry-certificates\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.678748 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a567d575-448c-4b55-8661-30eb4efdf317-trusted-ca\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.685194 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-registry-tls\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.686392 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a567d575-448c-4b55-8661-30eb4efdf317-installation-pull-secrets\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.735924 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-bound-sa-token\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.737460 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq926\" (UniqueName: \"kubernetes.io/projected/a567d575-448c-4b55-8661-30eb4efdf317-kube-api-access-tq926\") pod \"image-registry-66df7c8f76-5sdrm\" (UID: \"a567d575-448c-4b55-8661-30eb4efdf317\") " pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.759907 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mc2x2" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.759974 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-mc2x2" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.864775 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.989425 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-b2xs9" Jan 27 15:51:01 crc kubenswrapper[4713]: I0127 15:51:01.989897 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-b2xs9" Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.131383 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-5sdrm"] Jan 27 15:51:02 crc kubenswrapper[4713]: W0127 15:51:02.137825 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda567d575_448c_4b55_8661_30eb4efdf317.slice/crio-d5e866d49ac71beb519711b9057455cab67b5d642346d46ea63365d47800bc1e WatchSource:0}: Error finding container d5e866d49ac71beb519711b9057455cab67b5d642346d46ea63365d47800bc1e: Status 404 returned error can't find the container with id d5e866d49ac71beb519711b9057455cab67b5d642346d46ea63365d47800bc1e Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.270991 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-b2xs9" Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.271694 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mc2x2" Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.663909 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" event={"ID":"a567d575-448c-4b55-8661-30eb4efdf317","Type":"ContainerStarted","Data":"d5e866d49ac71beb519711b9057455cab67b5d642346d46ea63365d47800bc1e"} Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.666671 4713 generic.go:334] "Generic 
(PLEG): container finished" podID="7abaac35-62ca-4792-89e7-c9f32a551079" containerID="5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d" exitCode=0 Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.666731 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55xhg" event={"ID":"7abaac35-62ca-4792-89e7-c9f32a551079","Type":"ContainerDied","Data":"5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d"} Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.669964 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8aca94a-14f9-473c-b913-be4fececbb18" containerID="ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11" exitCode=0 Jan 27 15:51:02 crc kubenswrapper[4713]: I0127 15:51:02.670582 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkjdv" event={"ID":"c8aca94a-14f9-473c-b913-be4fececbb18","Type":"ContainerDied","Data":"ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11"} Jan 27 15:51:03 crc kubenswrapper[4713]: I0127 15:51:03.283663 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-mkwg6" Jan 27 15:51:03 crc kubenswrapper[4713]: I0127 15:51:03.679090 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" event={"ID":"a567d575-448c-4b55-8661-30eb4efdf317","Type":"ContainerStarted","Data":"72720af7fd8fb3753ad9a42e7802fa139822eddaacadcf1f8cff4b27b20af7aa"} Jan 27 15:51:03 crc kubenswrapper[4713]: I0127 15:51:03.681156 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:03 crc kubenswrapper[4713]: I0127 15:51:03.705762 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" 
podStartSLOduration=2.7057194190000002 podStartE2EDuration="2.705719419s" podCreationTimestamp="2026-01-27 15:51:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:51:03.700654512 +0000 UTC m=+231.478864460" watchObservedRunningTime="2026-01-27 15:51:03.705719419 +0000 UTC m=+231.483929367" Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.710285 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t92fx"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.843386 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55xhg"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.858476 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mc2x2"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.858895 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mc2x2" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="registry-server" containerID="cri-o://edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1" gracePeriod=30 Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.864636 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2xs9"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.865021 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-b2xs9" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="registry-server" containerID="cri-o://b4d85138dbae23d7b231a06088a7282b590d268eb95991b283035a6a6203334e" gracePeriod=30 Jan 27 15:51:08 crc kubenswrapper[4713]: E0127 15:51:08.874362 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = 
command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 15:51:08 crc kubenswrapper[4713]: E0127 15:51:08.876361 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.876868 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sndzr"] Jan 27 15:51:08 crc kubenswrapper[4713]: E0127 15:51:08.881433 4713 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1" cmd=["grpc_health_probe","-addr=:50051"] Jan 27 15:51:08 crc kubenswrapper[4713]: E0127 15:51:08.881539 4713 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-marketplace/certified-operators-mc2x2" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="registry-server" Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.894295 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlk8z"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.894607 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" 
containerName="marketplace-operator" containerID="cri-o://c0568a5dc169c1bff5c7b3062193f10d576f81dfed8150e415c3b9b92905c9e9" gracePeriod=30 Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.923260 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkjdv"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.933520 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjnst"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.943092 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t4f7m"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.945142 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.976403 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mcjdx"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.981541 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.981628 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d7bk4\" (UniqueName: \"kubernetes.io/projected/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-kube-api-access-d7bk4\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 
15:51:08.981674 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.982438 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t4f7m"] Jan 27 15:51:08 crc kubenswrapper[4713]: I0127 15:51:08.988091 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r4v5v"] Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.083582 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.083681 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d7bk4\" (UniqueName: \"kubernetes.io/projected/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-kube-api-access-d7bk4\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.083728 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: 
\"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.087253 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.091422 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.106122 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d7bk4\" (UniqueName: \"kubernetes.io/projected/4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833-kube-api-access-d7bk4\") pod \"marketplace-operator-79b997595-t4f7m\" (UID: \"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833\") " pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.170733 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-b2xs9" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="registry-server" probeResult="failure" output="" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.275331 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.742279 4713 generic.go:334] "Generic (PLEG): container finished" podID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerID="edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1" exitCode=0 Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.742347 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mc2x2" event={"ID":"7f6159db-0f3d-486f-9e71-fda6a1c18f1f","Type":"ContainerDied","Data":"edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1"} Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.745727 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerStarted","Data":"3df1950d489d80a526ca55a3231f509cccca2902217fac429e6f4c311d9cce7e"} Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.746011 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-mcjdx" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="registry-server" containerID="cri-o://3df1950d489d80a526ca55a3231f509cccca2902217fac429e6f4c311d9cce7e" gracePeriod=30 Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.749270 4713 generic.go:334] "Generic (PLEG): container finished" podID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" containerID="c0568a5dc169c1bff5c7b3062193f10d576f81dfed8150e415c3b9b92905c9e9" exitCode=0 Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.749328 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" event={"ID":"5ed86dba-52b7-4652-91c7-aea3c8def1fc","Type":"ContainerDied","Data":"c0568a5dc169c1bff5c7b3062193f10d576f81dfed8150e415c3b9b92905c9e9"} Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 
15:51:09.771242 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-mcjdx" podStartSLOduration=6.134274389 podStartE2EDuration="1m25.771218433s" podCreationTimestamp="2026-01-27 15:49:44 +0000 UTC" firstStartedPulling="2026-01-27 15:49:47.678124484 +0000 UTC m=+155.456334422" lastFinishedPulling="2026-01-27 15:51:07.315068528 +0000 UTC m=+235.093278466" observedRunningTime="2026-01-27 15:51:09.766532356 +0000 UTC m=+237.544742314" watchObservedRunningTime="2026-01-27 15:51:09.771218433 +0000 UTC m=+237.549428371" Jan 27 15:51:09 crc kubenswrapper[4713]: I0127 15:51:09.977397 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mc2x2" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.103510 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85t4t\" (UniqueName: \"kubernetes.io/projected/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-kube-api-access-85t4t\") pod \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.103627 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-catalog-content\") pod \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.103773 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-utilities\") pod \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\" (UID: \"7f6159db-0f3d-486f-9e71-fda6a1c18f1f\") " Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.104723 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-utilities" (OuterVolumeSpecName: "utilities") pod "7f6159db-0f3d-486f-9e71-fda6a1c18f1f" (UID: "7f6159db-0f3d-486f-9e71-fda6a1c18f1f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.109816 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-kube-api-access-85t4t" (OuterVolumeSpecName: "kube-api-access-85t4t") pod "7f6159db-0f3d-486f-9e71-fda6a1c18f1f" (UID: "7f6159db-0f3d-486f-9e71-fda6a1c18f1f"). InnerVolumeSpecName "kube-api-access-85t4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.164225 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.205746 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-trusted-ca\") pod \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.205822 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-operator-metrics\") pod \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.205968 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xc52l\" (UniqueName: \"kubernetes.io/projected/5ed86dba-52b7-4652-91c7-aea3c8def1fc-kube-api-access-xc52l\") pod 
\"5ed86dba-52b7-4652-91c7-aea3c8def1fc\" (UID: \"5ed86dba-52b7-4652-91c7-aea3c8def1fc\") " Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.206267 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85t4t\" (UniqueName: \"kubernetes.io/projected/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-kube-api-access-85t4t\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.206292 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.206831 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "5ed86dba-52b7-4652-91c7-aea3c8def1fc" (UID: "5ed86dba-52b7-4652-91c7-aea3c8def1fc"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.209566 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "5ed86dba-52b7-4652-91c7-aea3c8def1fc" (UID: "5ed86dba-52b7-4652-91c7-aea3c8def1fc"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.210639 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ed86dba-52b7-4652-91c7-aea3c8def1fc-kube-api-access-xc52l" (OuterVolumeSpecName: "kube-api-access-xc52l") pod "5ed86dba-52b7-4652-91c7-aea3c8def1fc" (UID: "5ed86dba-52b7-4652-91c7-aea3c8def1fc"). InnerVolumeSpecName "kube-api-access-xc52l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.307804 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.308057 4713 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5ed86dba-52b7-4652-91c7-aea3c8def1fc-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.308069 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xc52l\" (UniqueName: \"kubernetes.io/projected/5ed86dba-52b7-4652-91c7-aea3c8def1fc-kube-api-access-xc52l\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.759998 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" event={"ID":"5ed86dba-52b7-4652-91c7-aea3c8def1fc","Type":"ContainerDied","Data":"fa75e4ba9507b7ba25cb43485b2b66eef709c8c791b295ebf8670437cea23db5"} Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.760018 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-hlk8z" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.760246 4713 scope.go:117] "RemoveContainer" containerID="c0568a5dc169c1bff5c7b3062193f10d576f81dfed8150e415c3b9b92905c9e9" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.763928 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mc2x2" event={"ID":"7f6159db-0f3d-486f-9e71-fda6a1c18f1f","Type":"ContainerDied","Data":"9dfe35d070a78ced67f6726cee271369c3119b56cde83aaf4ac8a13153c98ad7"} Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.764101 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mc2x2" Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.766942 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerDied","Data":"b4d85138dbae23d7b231a06088a7282b590d268eb95991b283035a6a6203334e"} Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.766915 4713 generic.go:334] "Generic (PLEG): container finished" podID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerID="b4d85138dbae23d7b231a06088a7282b590d268eb95991b283035a6a6203334e" exitCode=0 Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.790905 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlk8z"] Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.792847 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-hlk8z"] Jan 27 15:51:10 crc kubenswrapper[4713]: I0127 15:51:10.914169 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" path="/var/lib/kubelet/pods/5ed86dba-52b7-4652-91c7-aea3c8def1fc/volumes" Jan 27 15:51:11 
crc kubenswrapper[4713]: I0127 15:51:11.482373 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2xs9" Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.528128 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-catalog-content\") pod \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.528203 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-utilities\") pod \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.528250 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-crl54\" (UniqueName: \"kubernetes.io/projected/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-kube-api-access-crl54\") pod \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\" (UID: \"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145\") " Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.529903 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-utilities" (OuterVolumeSpecName: "utilities") pod "9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" (UID: "9f1a4ce5-62ba-41c3-bad8-ee8a9054a145"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.535338 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-kube-api-access-crl54" (OuterVolumeSpecName: "kube-api-access-crl54") pod "9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" (UID: "9f1a4ce5-62ba-41c3-bad8-ee8a9054a145"). InnerVolumeSpecName "kube-api-access-crl54". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.584866 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" (UID: "9f1a4ce5-62ba-41c3-bad8-ee8a9054a145"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.629908 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.629942 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.629955 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-crl54\" (UniqueName: \"kubernetes.io/projected/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145-kube-api-access-crl54\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.774080 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-b2xs9" event={"ID":"9f1a4ce5-62ba-41c3-bad8-ee8a9054a145","Type":"ContainerDied","Data":"429fad458e6e534a2ac3ae5c1a60ed0fed6285cc4084d44475eafbd8a0d7ba82"}
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.774227 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-b2xs9"
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.779949 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcjdx_a2ab03d0-a8a9-4588-91e6-48d50999c219/registry-server/0.log"
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.780900 4713 generic.go:334] "Generic (PLEG): container finished" podID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerID="3df1950d489d80a526ca55a3231f509cccca2902217fac429e6f4c311d9cce7e" exitCode=1
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.780945 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerDied","Data":"3df1950d489d80a526ca55a3231f509cccca2902217fac429e6f4c311d9cce7e"}
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.805950 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-b2xs9"]
Jan 27 15:51:11 crc kubenswrapper[4713]: I0127 15:51:11.809303 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-b2xs9"]
Jan 27 15:51:12 crc kubenswrapper[4713]: I0127 15:51:12.910165 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" path="/var/lib/kubelet/pods/9f1a4ce5-62ba-41c3-bad8-ee8a9054a145/volumes"
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.019437 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcjdx_a2ab03d0-a8a9-4588-91e6-48d50999c219/registry-server/0.log"
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.020663 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcjdx"
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.054324 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-utilities\") pod \"a2ab03d0-a8a9-4588-91e6-48d50999c219\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") "
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.054387 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68fsg\" (UniqueName: \"kubernetes.io/projected/a2ab03d0-a8a9-4588-91e6-48d50999c219-kube-api-access-68fsg\") pod \"a2ab03d0-a8a9-4588-91e6-48d50999c219\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") "
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.054479 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-catalog-content\") pod \"a2ab03d0-a8a9-4588-91e6-48d50999c219\" (UID: \"a2ab03d0-a8a9-4588-91e6-48d50999c219\") "
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.060194 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-utilities" (OuterVolumeSpecName: "utilities") pod "a2ab03d0-a8a9-4588-91e6-48d50999c219" (UID: "a2ab03d0-a8a9-4588-91e6-48d50999c219"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.107534 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2ab03d0-a8a9-4588-91e6-48d50999c219-kube-api-access-68fsg" (OuterVolumeSpecName: "kube-api-access-68fsg") pod "a2ab03d0-a8a9-4588-91e6-48d50999c219" (UID: "a2ab03d0-a8a9-4588-91e6-48d50999c219"). InnerVolumeSpecName "kube-api-access-68fsg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.159982 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.160099 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68fsg\" (UniqueName: \"kubernetes.io/projected/a2ab03d0-a8a9-4588-91e6-48d50999c219-kube-api-access-68fsg\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.198198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a2ab03d0-a8a9-4588-91e6-48d50999c219" (UID: "a2ab03d0-a8a9-4588-91e6-48d50999c219"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.234805 4713 scope.go:117] "RemoveContainer" containerID="edaaa77e67ca37c4b719ae059722d8999cead023af6a6e2bd538793abc0616e1"
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.261352 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a2ab03d0-a8a9-4588-91e6-48d50999c219-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.794921 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-mcjdx_a2ab03d0-a8a9-4588-91e6-48d50999c219/registry-server/0.log"
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.795811 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-mcjdx" event={"ID":"a2ab03d0-a8a9-4588-91e6-48d50999c219","Type":"ContainerDied","Data":"9b0a6e8b24d95990ca45ee8194cc597ac4ad7344854c7a70d5d9bf7003a3d1a8"}
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.795930 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-mcjdx"
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.827701 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7f6159db-0f3d-486f-9e71-fda6a1c18f1f" (UID: "7f6159db-0f3d-486f-9e71-fda6a1c18f1f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.830885 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-mcjdx"]
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.836156 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-mcjdx"]
Jan 27 15:51:13 crc kubenswrapper[4713]: I0127 15:51:13.872424 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f6159db-0f3d-486f-9e71-fda6a1c18f1f-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:14 crc kubenswrapper[4713]: I0127 15:51:14.109971 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mc2x2"]
Jan 27 15:51:14 crc kubenswrapper[4713]: I0127 15:51:14.110082 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mc2x2"]
Jan 27 15:51:14 crc kubenswrapper[4713]: I0127 15:51:14.905912 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" path="/var/lib/kubelet/pods/7f6159db-0f3d-486f-9e71-fda6a1c18f1f/volumes"
Jan 27 15:51:14 crc kubenswrapper[4713]: I0127 15:51:14.906605 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" path="/var/lib/kubelet/pods/a2ab03d0-a8a9-4588-91e6-48d50999c219/volumes"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.601303 4713 scope.go:117] "RemoveContainer" containerID="d9321a65dc2037e8f974766b27fe167aa1f0c3875dc70733805bba8f1646d744"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.676217 4713 scope.go:117] "RemoveContainer" containerID="8285266abe8d1a9e06bf9f6164e24c07921352dedbf8d4d909b6322a92aa2d57"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.732013 4713 scope.go:117] "RemoveContainer" containerID="b4d85138dbae23d7b231a06088a7282b590d268eb95991b283035a6a6203334e"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.777415 4713 scope.go:117] "RemoveContainer" containerID="97576d58bfa555d0554f4f35c95923a066fe981d4c2bedf782a47b8195b4f448"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.823427 4713 scope.go:117] "RemoveContainer" containerID="549d55986d4e2662fad556087fa718e9cb476eaff423fa6de381cf6ae4de5e60"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.865740 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-t4f7m"]
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.870571 4713 scope.go:117] "RemoveContainer" containerID="3df1950d489d80a526ca55a3231f509cccca2902217fac429e6f4c311d9cce7e"
Jan 27 15:51:15 crc kubenswrapper[4713]: W0127 15:51:15.881077 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d53c0cc_4a2a_4b48_a2a8_dcf5e854f833.slice/crio-341d260815db62535d189435e2ea3b351b509d0e839715bde7cde86e7b754450 WatchSource:0}: Error finding container 341d260815db62535d189435e2ea3b351b509d0e839715bde7cde86e7b754450: Status 404 returned error can't find the container with id 341d260815db62535d189435e2ea3b351b509d0e839715bde7cde86e7b754450
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.911372 4713 scope.go:117] "RemoveContainer" containerID="19d02d35fa09b6f9d5ba45026905f4e019f315922c3837248aa73facd844bd41"
Jan 27 15:51:15 crc kubenswrapper[4713]: I0127 15:51:15.935712 4713 scope.go:117] "RemoveContainer" containerID="86090809c9f42092fbf6d9e334a8bd97f1addfc76efaababe6e18e3156fa6869"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.834416 4713 generic.go:334] "Generic (PLEG): container finished" podID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerID="bdb4267e9f49fbe3e204b27d4fce248f9434b2d703e864fc05c1e42d628c3e79" exitCode=0
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.834497 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sndzr" event={"ID":"7fd376e7-8ed4-449c-941d-2276a631f20b","Type":"ContainerDied","Data":"bdb4267e9f49fbe3e204b27d4fce248f9434b2d703e864fc05c1e42d628c3e79"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.839130 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55xhg" event={"ID":"7abaac35-62ca-4792-89e7-c9f32a551079","Type":"ContainerStarted","Data":"c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.839311 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-55xhg" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="registry-server" containerID="cri-o://c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd" gracePeriod=30
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.843002 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkjdv" event={"ID":"c8aca94a-14f9-473c-b913-be4fececbb18","Type":"ContainerStarted","Data":"412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.843172 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-dkjdv" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="registry-server" containerID="cri-o://412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6" gracePeriod=30
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.846939 4713 generic.go:334] "Generic (PLEG): container finished" podID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerID="821be719010d212605f858c146a83fd7aba6ca3c21938c01aa7642c87cc8f1f7" exitCode=0
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.847090 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4v5v" event={"ID":"8597977e-b4b9-4ad9-98ac-61187df61af5","Type":"ContainerDied","Data":"821be719010d212605f858c146a83fd7aba6ca3c21938c01aa7642c87cc8f1f7"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.856620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" event={"ID":"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833","Type":"ContainerStarted","Data":"fe649e0b83633a45201cab1d243931c585d2c83b88672181c7f598063bdf7145"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.856671 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" event={"ID":"4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833","Type":"ContainerStarted","Data":"341d260815db62535d189435e2ea3b351b509d0e839715bde7cde86e7b754450"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.857237 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.864915 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.867893 4713 generic.go:334] "Generic (PLEG): container finished" podID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerID="8bcd287848f20b14637fd2ddbd6c104cdffc3d4449556fed1e8f8121b18c592e" exitCode=0
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.867940 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjnst" event={"ID":"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1","Type":"ContainerDied","Data":"8bcd287848f20b14637fd2ddbd6c104cdffc3d4449556fed1e8f8121b18c592e"}
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.909532 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-t4f7m" podStartSLOduration=8.909503071 podStartE2EDuration="8.909503071s" podCreationTimestamp="2026-01-27 15:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:51:16.90706302 +0000 UTC m=+244.685272958" watchObservedRunningTime="2026-01-27 15:51:16.909503071 +0000 UTC m=+244.687713009"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.931880 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dkjdv" podStartSLOduration=6.551888954 podStartE2EDuration="1m33.931857521s" podCreationTimestamp="2026-01-27 15:49:43 +0000 UTC" firstStartedPulling="2026-01-27 15:49:45.593498562 +0000 UTC m=+153.371708500" lastFinishedPulling="2026-01-27 15:51:12.973467129 +0000 UTC m=+240.751677067" observedRunningTime="2026-01-27 15:51:16.927137324 +0000 UTC m=+244.705347262" watchObservedRunningTime="2026-01-27 15:51:16.931857521 +0000 UTC m=+244.710067449"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946463 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-w2fhp"]
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946736 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="extract-utilities"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946750 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="extract-utilities"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946763 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="extract-content"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946770 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="extract-content"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946780 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946786 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946798 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946806 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946817 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="extract-utilities"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946823 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="extract-utilities"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946832 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" containerName="marketplace-operator"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946838 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" containerName="marketplace-operator"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946847 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946853 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946861 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="extract-utilities"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946867 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="extract-utilities"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946876 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="extract-content"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946883 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="extract-content"
Jan 27 15:51:16 crc kubenswrapper[4713]: E0127 15:51:16.946891 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="extract-content"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.946896 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="extract-content"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.947008 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f6159db-0f3d-486f-9e71-fda6a1c18f1f" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.947022 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2ab03d0-a8a9-4588-91e6-48d50999c219" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.947029 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ed86dba-52b7-4652-91c7-aea3c8def1fc" containerName="marketplace-operator"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.947053 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f1a4ce5-62ba-41c3-bad8-ee8a9054a145" containerName="registry-server"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.947823 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.959318 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-55xhg" podStartSLOduration=4.866760501 podStartE2EDuration="1m35.959298439s" podCreationTimestamp="2026-01-27 15:49:41 +0000 UTC" firstStartedPulling="2026-01-27 15:49:44.544204401 +0000 UTC m=+152.322414329" lastFinishedPulling="2026-01-27 15:51:15.636742329 +0000 UTC m=+243.414952267" observedRunningTime="2026-01-27 15:51:16.95589552 +0000 UTC m=+244.734105458" watchObservedRunningTime="2026-01-27 15:51:16.959298439 +0000 UTC m=+244.737508377"
Jan 27 15:51:16 crc kubenswrapper[4713]: I0127 15:51:16.967940 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2fhp"]
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.015060 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5kc6\" (UniqueName: \"kubernetes.io/projected/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-kube-api-access-s5kc6\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.015547 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-catalog-content\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.015623 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-utilities\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.117182 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-catalog-content\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.117257 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-utilities\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.117318 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5kc6\" (UniqueName: \"kubernetes.io/projected/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-kube-api-access-s5kc6\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.119083 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-utilities\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.120946 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-catalog-content\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.150224 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5kc6\" (UniqueName: \"kubernetes.io/projected/ebb9c0b9-bd38-4496-990a-ab1d02cc792b-kube-api-access-s5kc6\") pod \"certified-operators-w2fhp\" (UID: \"ebb9c0b9-bd38-4496-990a-ab1d02cc792b\") " pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.183771 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-sndzr"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.193092 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r4v5v"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.196025 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjnst"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.218134 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-catalog-content\") pod \"7fd376e7-8ed4-449c-941d-2276a631f20b\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.218190 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gw2qx\" (UniqueName: \"kubernetes.io/projected/7fd376e7-8ed4-449c-941d-2276a631f20b-kube-api-access-gw2qx\") pod \"7fd376e7-8ed4-449c-941d-2276a631f20b\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.218293 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-utilities\") pod \"7fd376e7-8ed4-449c-941d-2276a631f20b\" (UID: \"7fd376e7-8ed4-449c-941d-2276a631f20b\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.219484 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-utilities" (OuterVolumeSpecName: "utilities") pod "7fd376e7-8ed4-449c-941d-2276a631f20b" (UID: "7fd376e7-8ed4-449c-941d-2276a631f20b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.224817 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fd376e7-8ed4-449c-941d-2276a631f20b-kube-api-access-gw2qx" (OuterVolumeSpecName: "kube-api-access-gw2qx") pod "7fd376e7-8ed4-449c-941d-2276a631f20b" (UID: "7fd376e7-8ed4-449c-941d-2276a631f20b"). InnerVolumeSpecName "kube-api-access-gw2qx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.247016 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-55xhg_7abaac35-62ca-4792-89e7-c9f32a551079/registry-server/0.log"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.248841 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55xhg"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.294671 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-w2fhp"
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.313085 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7fd376e7-8ed4-449c-941d-2276a631f20b" (UID: "7fd376e7-8ed4-449c-941d-2276a631f20b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.319696 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-utilities\") pod \"7abaac35-62ca-4792-89e7-c9f32a551079\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.319784 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-utilities\") pod \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.319815 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml77s\" (UniqueName: \"kubernetes.io/projected/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-kube-api-access-ml77s\") pod \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.319846 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvc6g\" (UniqueName: \"kubernetes.io/projected/8597977e-b4b9-4ad9-98ac-61187df61af5-kube-api-access-jvc6g\") pod \"8597977e-b4b9-4ad9-98ac-61187df61af5\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.319929 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-catalog-content\") pod \"8597977e-b4b9-4ad9-98ac-61187df61af5\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.319977 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v95tg\" (UniqueName: \"kubernetes.io/projected/7abaac35-62ca-4792-89e7-c9f32a551079-kube-api-access-v95tg\") pod \"7abaac35-62ca-4792-89e7-c9f32a551079\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320025 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-utilities\") pod \"8597977e-b4b9-4ad9-98ac-61187df61af5\" (UID: \"8597977e-b4b9-4ad9-98ac-61187df61af5\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320080 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-catalog-content\") pod \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\" (UID: \"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320102 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-catalog-content\") pod \"7abaac35-62ca-4792-89e7-c9f32a551079\" (UID: \"7abaac35-62ca-4792-89e7-c9f32a551079\") "
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320392 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320412 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7fd376e7-8ed4-449c-941d-2276a631f20b-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320427 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gw2qx\" (UniqueName: \"kubernetes.io/projected/7fd376e7-8ed4-449c-941d-2276a631f20b-kube-api-access-gw2qx\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.320694 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-utilities" (OuterVolumeSpecName: "utilities") pod "49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" (UID: "49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.321635 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-utilities" (OuterVolumeSpecName: "utilities") pod "7abaac35-62ca-4792-89e7-c9f32a551079" (UID: "7abaac35-62ca-4792-89e7-c9f32a551079"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.330117 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-utilities" (OuterVolumeSpecName: "utilities") pod "8597977e-b4b9-4ad9-98ac-61187df61af5" (UID: "8597977e-b4b9-4ad9-98ac-61187df61af5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.331977 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8597977e-b4b9-4ad9-98ac-61187df61af5-kube-api-access-jvc6g" (OuterVolumeSpecName: "kube-api-access-jvc6g") pod "8597977e-b4b9-4ad9-98ac-61187df61af5" (UID: "8597977e-b4b9-4ad9-98ac-61187df61af5"). InnerVolumeSpecName "kube-api-access-jvc6g". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.332684 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7abaac35-62ca-4792-89e7-c9f32a551079-kube-api-access-v95tg" (OuterVolumeSpecName: "kube-api-access-v95tg") pod "7abaac35-62ca-4792-89e7-c9f32a551079" (UID: "7abaac35-62ca-4792-89e7-c9f32a551079"). InnerVolumeSpecName "kube-api-access-v95tg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.336153 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-kube-api-access-ml77s" (OuterVolumeSpecName: "kube-api-access-ml77s") pod "49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" (UID: "49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1"). InnerVolumeSpecName "kube-api-access-ml77s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.366620 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" (UID: "49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.388020 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7abaac35-62ca-4792-89e7-c9f32a551079" (UID: "7abaac35-62ca-4792-89e7-c9f32a551079"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421690 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421727 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421744 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421755 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7abaac35-62ca-4792-89e7-c9f32a551079-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421767 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421778 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml77s\" (UniqueName: \"kubernetes.io/projected/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1-kube-api-access-ml77s\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421790 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvc6g\" (UniqueName: \"kubernetes.io/projected/8597977e-b4b9-4ad9-98ac-61187df61af5-kube-api-access-jvc6g\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.421834
4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v95tg\" (UniqueName: \"kubernetes.io/projected/7abaac35-62ca-4792-89e7-c9f32a551079-kube-api-access-v95tg\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.487081 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-w2fhp"] Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.492511 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8597977e-b4b9-4ad9-98ac-61187df61af5" (UID: "8597977e-b4b9-4ad9-98ac-61187df61af5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.523637 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8597977e-b4b9-4ad9-98ac-61187df61af5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.805957 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.874359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2fhp" event={"ID":"ebb9c0b9-bd38-4496-990a-ab1d02cc792b","Type":"ContainerDied","Data":"74f0539e0710a15ca81a358729cea34b11c57a514bba2d9ef032b2b239552798"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.874166 4713 generic.go:334] "Generic (PLEG): container finished" podID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" containerID="74f0539e0710a15ca81a358729cea34b11c57a514bba2d9ef032b2b239552798" exitCode=0 Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.875180 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2fhp" event={"ID":"ebb9c0b9-bd38-4496-990a-ab1d02cc792b","Type":"ContainerStarted","Data":"e235194d700d9b84f0c21ad8fb3003a166ba2447364e23479c294ec095e0c47b"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.877878 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r4v5v" event={"ID":"8597977e-b4b9-4ad9-98ac-61187df61af5","Type":"ContainerDied","Data":"71e982eb5fec648320d3d344cb144d10b17ccb4115a727f109cb857ca89022b8"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.877925 4713 scope.go:117] "RemoveContainer" containerID="821be719010d212605f858c146a83fd7aba6ca3c21938c01aa7642c87cc8f1f7" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.878077 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r4v5v" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.882573 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sjnst" event={"ID":"49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1","Type":"ContainerDied","Data":"d0c6e8c3fa93fde547baaec8f7d219d897fcc3f66f34309def0ea40be0f167cd"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.882608 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sjnst" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.889849 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-sndzr" event={"ID":"7fd376e7-8ed4-449c-941d-2276a631f20b","Type":"ContainerDied","Data":"cb585da552a5d177b38d09168943c0267e9a47faddc3b568c60932933055bcd8"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.889868 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-sndzr" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.891280 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-55xhg_7abaac35-62ca-4792-89e7-c9f32a551079/registry-server/0.log" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.891812 4713 generic.go:334] "Generic (PLEG): container finished" podID="7abaac35-62ca-4792-89e7-c9f32a551079" containerID="c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd" exitCode=1 Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.891882 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55xhg" event={"ID":"7abaac35-62ca-4792-89e7-c9f32a551079","Type":"ContainerDied","Data":"c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.891917 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-55xhg" event={"ID":"7abaac35-62ca-4792-89e7-c9f32a551079","Type":"ContainerDied","Data":"8517fba529d0830971338f6d426a0d60a83bd75a65e66956d76727649b3f5469"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.891992 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-55xhg" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.894887 4713 generic.go:334] "Generic (PLEG): container finished" podID="c8aca94a-14f9-473c-b913-be4fececbb18" containerID="412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6" exitCode=0 Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.894918 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dkjdv" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.895160 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkjdv" event={"ID":"c8aca94a-14f9-473c-b913-be4fececbb18","Type":"ContainerDied","Data":"412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.895224 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dkjdv" event={"ID":"c8aca94a-14f9-473c-b913-be4fececbb18","Type":"ContainerDied","Data":"471589bee8e0b5212c9057cba07a05bc7a8d7d019e07a03d56f7c812a388fb02"} Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.915574 4713 scope.go:117] "RemoveContainer" containerID="70d466e7b15e205c50d6fc776d11e5c27f023b259253de035389bcf5cdba21d5" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.931987 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-catalog-content\") pod \"c8aca94a-14f9-473c-b913-be4fececbb18\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.932076 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-utilities\") pod \"c8aca94a-14f9-473c-b913-be4fececbb18\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.932117 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rfbbm\" (UniqueName: \"kubernetes.io/projected/c8aca94a-14f9-473c-b913-be4fececbb18-kube-api-access-rfbbm\") pod \"c8aca94a-14f9-473c-b913-be4fececbb18\" (UID: \"c8aca94a-14f9-473c-b913-be4fececbb18\") " Jan 27 15:51:17 crc 
kubenswrapper[4713]: I0127 15:51:17.938821 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-utilities" (OuterVolumeSpecName: "utilities") pod "c8aca94a-14f9-473c-b913-be4fececbb18" (UID: "c8aca94a-14f9-473c-b913-be4fececbb18"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.944495 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-55xhg"] Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.946480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8aca94a-14f9-473c-b913-be4fececbb18-kube-api-access-rfbbm" (OuterVolumeSpecName: "kube-api-access-rfbbm") pod "c8aca94a-14f9-473c-b913-be4fececbb18" (UID: "c8aca94a-14f9-473c-b913-be4fececbb18"). InnerVolumeSpecName "kube-api-access-rfbbm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.947804 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-55xhg"] Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.967305 4713 scope.go:117] "RemoveContainer" containerID="8bcd287848f20b14637fd2ddbd6c104cdffc3d4449556fed1e8f8121b18c592e" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.967960 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c8aca94a-14f9-473c-b913-be4fececbb18" (UID: "c8aca94a-14f9-473c-b913-be4fececbb18"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.980489 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-r4v5v"] Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.995528 4713 scope.go:117] "RemoveContainer" containerID="67ae5dbfee5976259542752aa0fb489e0dd126a8db80fb1657b1bb620d3cc592" Jan 27 15:51:17 crc kubenswrapper[4713]: I0127 15:51:17.997608 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-r4v5v"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.020004 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-sndzr"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.022369 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-sndzr"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.025138 4713 scope.go:117] "RemoveContainer" containerID="bdb4267e9f49fbe3e204b27d4fce248f9434b2d703e864fc05c1e42d628c3e79" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.036157 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.036194 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c8aca94a-14f9-473c-b913-be4fececbb18-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.036209 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rfbbm\" (UniqueName: \"kubernetes.io/projected/c8aca94a-14f9-473c-b913-be4fececbb18-kube-api-access-rfbbm\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.043646 4713 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjnst"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.046374 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sjnst"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.052367 4713 scope.go:117] "RemoveContainer" containerID="7530abd5c6392246e837c132a1687109ffe40beea3b38d05d4355a1e1d84ca07" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.070151 4713 scope.go:117] "RemoveContainer" containerID="c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.083802 4713 scope.go:117] "RemoveContainer" containerID="5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.103088 4713 scope.go:117] "RemoveContainer" containerID="72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.118492 4713 scope.go:117] "RemoveContainer" containerID="c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd" Jan 27 15:51:18 crc kubenswrapper[4713]: E0127 15:51:18.119116 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd\": container with ID starting with c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd not found: ID does not exist" containerID="c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.119184 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd"} err="failed to get container status \"c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd\": rpc error: code = NotFound desc = could not 
find container \"c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd\": container with ID starting with c5e1c718667c67534a62710a5fa6b71f94c6bed885c21238700fb8cf4154b8fd not found: ID does not exist" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.119226 4713 scope.go:117] "RemoveContainer" containerID="5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d" Jan 27 15:51:18 crc kubenswrapper[4713]: E0127 15:51:18.119566 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d\": container with ID starting with 5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d not found: ID does not exist" containerID="5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.119600 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d"} err="failed to get container status \"5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d\": rpc error: code = NotFound desc = could not find container \"5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d\": container with ID starting with 5524335a3b1ff0cf33304f55cb143b9cd24f11aeb9e4a326c0d73a5c6d9c547d not found: ID does not exist" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.119622 4713 scope.go:117] "RemoveContainer" containerID="72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a" Jan 27 15:51:18 crc kubenswrapper[4713]: E0127 15:51:18.119864 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a\": container with ID starting with 72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a not found: ID 
does not exist" containerID="72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.119890 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a"} err="failed to get container status \"72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a\": rpc error: code = NotFound desc = could not find container \"72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a\": container with ID starting with 72eb234ea02c5a204cd2d6177c1c37075ba1420ff8a653862a4141890c33e13a not found: ID does not exist" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.119907 4713 scope.go:117] "RemoveContainer" containerID="412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.132337 4713 scope.go:117] "RemoveContainer" containerID="ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.147159 4713 scope.go:117] "RemoveContainer" containerID="de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.164507 4713 scope.go:117] "RemoveContainer" containerID="412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6" Jan 27 15:51:18 crc kubenswrapper[4713]: E0127 15:51:18.165031 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6\": container with ID starting with 412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6 not found: ID does not exist" containerID="412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.165087 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6"} err="failed to get container status \"412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6\": rpc error: code = NotFound desc = could not find container \"412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6\": container with ID starting with 412b5a9c630f71c5875bcbe958bd4435297ac812c3ea6ac3866079ece17a20c6 not found: ID does not exist" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.165121 4713 scope.go:117] "RemoveContainer" containerID="ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11" Jan 27 15:51:18 crc kubenswrapper[4713]: E0127 15:51:18.165670 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11\": container with ID starting with ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11 not found: ID does not exist" containerID="ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.165727 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11"} err="failed to get container status \"ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11\": rpc error: code = NotFound desc = could not find container \"ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11\": container with ID starting with ee8ff76bb8163e6b101138e5673cb5397952f7ae32fce29acf01b52386b57e11 not found: ID does not exist" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.165750 4713 scope.go:117] "RemoveContainer" containerID="de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08" Jan 27 15:51:18 crc kubenswrapper[4713]: E0127 15:51:18.166168 4713 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08\": container with ID starting with de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08 not found: ID does not exist" containerID="de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.166205 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08"} err="failed to get container status \"de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08\": rpc error: code = NotFound desc = could not find container \"de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08\": container with ID starting with de6a286a59bb52be72786fb6aa3409ce560521af1af6dcaa0b947a9bf257cd08 not found: ID does not exist" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.230010 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkjdv"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.235866 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-dkjdv"] Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.908184 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" path="/var/lib/kubelet/pods/49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1/volumes" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.908890 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" path="/var/lib/kubelet/pods/7abaac35-62ca-4792-89e7-c9f32a551079/volumes" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.910842 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" 
path="/var/lib/kubelet/pods/7fd376e7-8ed4-449c-941d-2276a631f20b/volumes" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.912085 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" path="/var/lib/kubelet/pods/8597977e-b4b9-4ad9-98ac-61187df61af5/volumes" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.914596 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" path="/var/lib/kubelet/pods/c8aca94a-14f9-473c-b913-be4fececbb18/volumes" Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.932298 4713 generic.go:334] "Generic (PLEG): container finished" podID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" containerID="17176e814f022d551315a0e0b719edc45914fb1cc7afe61e0a8ea9601f167110" exitCode=0 Jan 27 15:51:18 crc kubenswrapper[4713]: I0127 15:51:18.933816 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2fhp" event={"ID":"ebb9c0b9-bd38-4496-990a-ab1d02cc792b","Type":"ContainerDied","Data":"17176e814f022d551315a0e0b719edc45914fb1cc7afe61e0a8ea9601f167110"} Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.340855 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fkchc"] Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341097 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341111 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341122 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341129 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341140 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341146 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341154 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341161 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341168 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341173 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341180 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="registry-server" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341186 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="registry-server" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341198 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="registry-server" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341205 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="registry-server" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341217 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341224 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341240 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341247 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341254 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341261 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341271 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341277 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: E0127 15:51:19.341287 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341293 4713 
state_mem.go:107] "Deleted CPUSet assignment" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerName="extract-utilities" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341384 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="49538a8e-50b8-45b6-9a4c-3ca6aa0d6dd1" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341394 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8aca94a-14f9-473c-b913-be4fececbb18" containerName="registry-server" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341402 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8597977e-b4b9-4ad9-98ac-61187df61af5" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341409 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fd376e7-8ed4-449c-941d-2276a631f20b" containerName="extract-content" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.341416 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7abaac35-62ca-4792-89e7-c9f32a551079" containerName="registry-server" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.342186 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.347463 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.351968 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fkchc"] Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.464445 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd505900-1549-404e-acfa-948789f8372d-catalog-content\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.464843 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd505900-1549-404e-acfa-948789f8372d-utilities\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.465091 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s72dj\" (UniqueName: \"kubernetes.io/projected/dd505900-1549-404e-acfa-948789f8372d-kube-api-access-s72dj\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.549246 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jclch"] Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.550394 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.554697 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.563948 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jclch"] Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.567601 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd505900-1549-404e-acfa-948789f8372d-utilities\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.567895 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s72dj\" (UniqueName: \"kubernetes.io/projected/dd505900-1549-404e-acfa-948789f8372d-kube-api-access-s72dj\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.567977 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd505900-1549-404e-acfa-948789f8372d-catalog-content\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.568479 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/dd505900-1549-404e-acfa-948789f8372d-utilities\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 
15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.568580 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/dd505900-1549-404e-acfa-948789f8372d-catalog-content\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.595111 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s72dj\" (UniqueName: \"kubernetes.io/projected/dd505900-1549-404e-acfa-948789f8372d-kube-api-access-s72dj\") pod \"redhat-operators-fkchc\" (UID: \"dd505900-1549-404e-acfa-948789f8372d\") " pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.660881 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.669646 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkxww\" (UniqueName: \"kubernetes.io/projected/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-kube-api-access-fkxww\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.669719 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-utilities\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.669809 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-catalog-content\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.775707 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-catalog-content\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.772008 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-catalog-content\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.776117 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkxww\" (UniqueName: \"kubernetes.io/projected/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-kube-api-access-fkxww\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.776225 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-utilities\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.776610 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-utilities\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.802553 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkxww\" (UniqueName: \"kubernetes.io/projected/9a8987a5-59c3-478b-8b6a-1f2712f6a6f8-kube-api-access-fkxww\") pod \"community-operators-jclch\" (UID: \"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8\") " pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.872103 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jclch" Jan 27 15:51:19 crc kubenswrapper[4713]: I0127 15:51:19.940358 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-w2fhp" event={"ID":"ebb9c0b9-bd38-4496-990a-ab1d02cc792b","Type":"ContainerStarted","Data":"dd8fcb3752766536948cbc67e9720e6dde99520c28ea3afc3a15c870d6d8cf83"} Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.067563 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-w2fhp" podStartSLOduration=2.357339785 podStartE2EDuration="4.067536161s" podCreationTimestamp="2026-01-27 15:51:16 +0000 UTC" firstStartedPulling="2026-01-27 15:51:17.876180309 +0000 UTC m=+245.654390247" lastFinishedPulling="2026-01-27 15:51:19.586376685 +0000 UTC m=+247.364586623" observedRunningTime="2026-01-27 15:51:19.960383445 +0000 UTC m=+247.738593383" watchObservedRunningTime="2026-01-27 15:51:20.067536161 +0000 UTC m=+247.845746099" Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.070212 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jclch"] Jan 27 15:51:20 crc kubenswrapper[4713]: W0127 15:51:20.076715 
4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a8987a5_59c3_478b_8b6a_1f2712f6a6f8.slice/crio-bc97ae6c9342e24380966b7001a78f0a19549c7764bc93faac2a1e9d56ef8cca WatchSource:0}: Error finding container bc97ae6c9342e24380966b7001a78f0a19549c7764bc93faac2a1e9d56ef8cca: Status 404 returned error can't find the container with id bc97ae6c9342e24380966b7001a78f0a19549c7764bc93faac2a1e9d56ef8cca Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.144307 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fkchc"] Jan 27 15:51:20 crc kubenswrapper[4713]: W0127 15:51:20.149234 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd505900_1549_404e_acfa_948789f8372d.slice/crio-ea7c6ad58ba95d59ffdf7a34466768ee2dea2c686adc1d1eba280a9c037c45cb WatchSource:0}: Error finding container ea7c6ad58ba95d59ffdf7a34466768ee2dea2c686adc1d1eba280a9c037c45cb: Status 404 returned error can't find the container with id ea7c6ad58ba95d59ffdf7a34466768ee2dea2c686adc1d1eba280a9c037c45cb Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.949790 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" containerID="289e6f2ca14b4f5d61c6eb0bf7b732eeb9224a7e1617b35c5b598a2027363f31" exitCode=0 Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.950208 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jclch" event={"ID":"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8","Type":"ContainerDied","Data":"289e6f2ca14b4f5d61c6eb0bf7b732eeb9224a7e1617b35c5b598a2027363f31"} Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.950245 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jclch" 
event={"ID":"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8","Type":"ContainerStarted","Data":"bc97ae6c9342e24380966b7001a78f0a19549c7764bc93faac2a1e9d56ef8cca"} Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.952794 4713 generic.go:334] "Generic (PLEG): container finished" podID="dd505900-1549-404e-acfa-948789f8372d" containerID="dc06a2bcf9e6f82e680cb6e69e12a5bdaf17a6ab23c3a599e8373e691d691012" exitCode=0 Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.954253 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fkchc" event={"ID":"dd505900-1549-404e-acfa-948789f8372d","Type":"ContainerDied","Data":"dc06a2bcf9e6f82e680cb6e69e12a5bdaf17a6ab23c3a599e8373e691d691012"} Jan 27 15:51:20 crc kubenswrapper[4713]: I0127 15:51:20.954293 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fkchc" event={"ID":"dd505900-1549-404e-acfa-948789f8372d","Type":"ContainerStarted","Data":"ea7c6ad58ba95d59ffdf7a34466768ee2dea2c686adc1d1eba280a9c037c45cb"} Jan 27 15:51:21 crc kubenswrapper[4713]: I0127 15:51:21.872956 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-5sdrm" Jan 27 15:51:21 crc kubenswrapper[4713]: I0127 15:51:21.944182 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-jjr22"] Jan 27 15:51:21 crc kubenswrapper[4713]: I0127 15:51:21.948835 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtsb"] Jan 27 15:51:21 crc kubenswrapper[4713]: I0127 15:51:21.949989 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:21 crc kubenswrapper[4713]: I0127 15:51:21.968047 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 15:51:21 crc kubenswrapper[4713]: I0127 15:51:21.971598 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtsb"] Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.005271 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jclch" event={"ID":"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8","Type":"ContainerStarted","Data":"79796932a81ef354d965dad35d54512017b691afe1c2cbfcb147fe683875a480"} Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.006480 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6nrd\" (UniqueName: \"kubernetes.io/projected/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-kube-api-access-t6nrd\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.006545 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-utilities\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.006599 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-catalog-content\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " 
pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.125332 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6nrd\" (UniqueName: \"kubernetes.io/projected/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-kube-api-access-t6nrd\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.125417 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-utilities\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.125459 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-catalog-content\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.126022 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-catalog-content\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.126443 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-utilities\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" 
Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.158687 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6nrd\" (UniqueName: \"kubernetes.io/projected/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-kube-api-access-t6nrd\") pod \"redhat-marketplace-rwtsb\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.302011 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:51:22 crc kubenswrapper[4713]: I0127 15:51:22.520691 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtsb"] Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.018145 4713 generic.go:334] "Generic (PLEG): container finished" podID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerID="6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1" exitCode=0 Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.018704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerDied","Data":"6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1"} Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.018761 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerStarted","Data":"0357391f3717f9272955b2eae88630ae9751d3f8ff45378ea451fd31e4386759"} Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.023907 4713 generic.go:334] "Generic (PLEG): container finished" podID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" containerID="79796932a81ef354d965dad35d54512017b691afe1c2cbfcb147fe683875a480" exitCode=0 Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.024056 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jclch" event={"ID":"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8","Type":"ContainerDied","Data":"79796932a81ef354d965dad35d54512017b691afe1c2cbfcb147fe683875a480"} Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.026827 4713 generic.go:334] "Generic (PLEG): container finished" podID="dd505900-1549-404e-acfa-948789f8372d" containerID="5f89e9eebd27f28e3866baf9e0b53c080f0fd5403d21844e8a4aee922c1fa769" exitCode=0 Jan 27 15:51:23 crc kubenswrapper[4713]: I0127 15:51:23.026883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fkchc" event={"ID":"dd505900-1549-404e-acfa-948789f8372d","Type":"ContainerDied","Data":"5f89e9eebd27f28e3866baf9e0b53c080f0fd5403d21844e8a4aee922c1fa769"} Jan 27 15:51:24 crc kubenswrapper[4713]: I0127 15:51:24.035054 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jclch" event={"ID":"9a8987a5-59c3-478b-8b6a-1f2712f6a6f8","Type":"ContainerStarted","Data":"1cdcbcb040c7ca525b871533a77eeb9528acaf43da3aeae214fb8dd2a3648589"} Jan 27 15:51:24 crc kubenswrapper[4713]: I0127 15:51:24.039075 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fkchc" event={"ID":"dd505900-1549-404e-acfa-948789f8372d","Type":"ContainerStarted","Data":"92c90c7f1a6ea3a28a22b1968615735aaf9b3f624a4cbddd22defbd5e7d5206a"} Jan 27 15:51:24 crc kubenswrapper[4713]: I0127 15:51:24.041478 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerStarted","Data":"ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296"} Jan 27 15:51:24 crc kubenswrapper[4713]: I0127 15:51:24.058994 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jclch" 
podStartSLOduration=2.56362282 podStartE2EDuration="5.058966914s" podCreationTimestamp="2026-01-27 15:51:19 +0000 UTC" firstStartedPulling="2026-01-27 15:51:20.951970678 +0000 UTC m=+248.730180616" lastFinishedPulling="2026-01-27 15:51:23.447314762 +0000 UTC m=+251.225524710" observedRunningTime="2026-01-27 15:51:24.053808524 +0000 UTC m=+251.832018462" watchObservedRunningTime="2026-01-27 15:51:24.058966914 +0000 UTC m=+251.837176852" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.071083 4713 generic.go:334] "Generic (PLEG): container finished" podID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerID="ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296" exitCode=0 Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.071807 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerDied","Data":"ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296"} Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.090739 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.092185 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fkchc" podStartSLOduration=3.608710679 podStartE2EDuration="6.092157207s" podCreationTimestamp="2026-01-27 15:51:19 +0000 UTC" firstStartedPulling="2026-01-27 15:51:20.955373397 +0000 UTC m=+248.733583345" lastFinishedPulling="2026-01-27 15:51:23.438819935 +0000 UTC m=+251.217029873" observedRunningTime="2026-01-27 15:51:24.110912735 +0000 UTC m=+251.889122683" watchObservedRunningTime="2026-01-27 15:51:25.092157207 +0000 UTC m=+252.870367145" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.092485 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.103841 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.104612 4713 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105331 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105351 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105390 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105397 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105411 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105418 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105426 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105458 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105471 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105478 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105516 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105523 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.105534 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105540 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.105996 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106078 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106090 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 27 15:51:25 crc kubenswrapper[4713]: 
I0127 15:51:25.106048 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038" gracePeriod=15 Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106165 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735" gracePeriod=15 Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106190 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac" gracePeriod=15 Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106580 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061" gracePeriod=15 Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106656 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4" gracePeriod=15 Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106109 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106937 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.106971 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.164909 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170687 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170789 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170820 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170884 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170911 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170938 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.170967 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.171016 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc 
kubenswrapper[4713]: I0127 15:51:25.271713 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271794 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271832 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271836 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271876 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271897 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271902 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271937 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271949 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271971 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271996 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.272021 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.272068 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.272068 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.271972 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: I0127 15:51:25.460580 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 27 15:51:25 crc kubenswrapper[4713]: E0127 15:51:25.464158 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-rwtsb.188ea153eeb70126 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-rwtsb,UID:c255f079-b4c8-4c29-8e77-28e56c9a9ecf,APIVersion:v1,ResourceVersion:29714,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 387ms (387ms including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:51:25.46251191 +0000 UTC m=+253.240721868,LastTimestamp:2026-01-27 15:51:25.46251191 +0000 UTC m=+253.240721868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:51:25 crc kubenswrapper[4713]: W0127 15:51:25.494009 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-00188965c2ae56d8de5eeff27d6250a9c840b5bbdefb424112766ef8fd87db8b WatchSource:0}: Error finding container 00188965c2ae56d8de5eeff27d6250a9c840b5bbdefb424112766ef8fd87db8b: Status 404 returned error can't find the container with id 00188965c2ae56d8de5eeff27d6250a9c840b5bbdefb424112766ef8fd87db8b Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.080410 4713 generic.go:334] "Generic (PLEG): container finished" podID="3b10c0e8-a755-4134-a03c-4c84d3e05238" containerID="591e44fd633f88eb4daeeeaf0f44dd914dc7837dc204854267365805405cee98" exitCode=0 Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.080517 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b10c0e8-a755-4134-a03c-4c84d3e05238","Type":"ContainerDied","Data":"591e44fd633f88eb4daeeeaf0f44dd914dc7837dc204854267365805405cee98"} Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.082184 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.082745 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.083283 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.083604 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"00188965c2ae56d8de5eeff27d6250a9c840b5bbdefb424112766ef8fd87db8b"} Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.086718 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerStarted","Data":"34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3"} Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.087614 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.088056 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.088515 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.088801 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.089304 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.090965 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.091658 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac" exitCode=0 Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.091684 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735" exitCode=0 Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.091697 4713 
generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4" exitCode=0 Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.091708 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061" exitCode=2 Jan 27 15:51:26 crc kubenswrapper[4713]: I0127 15:51:26.091763 4713 scope.go:117] "RemoveContainer" containerID="d6f7d5111fb01f5bd3995517e4df654fd8a15eeed12072275a858294f23e1f46" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.099268 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d8a49a95a57ace26bb9f5f2b76d3662c8a88064a53cdf39542dc93ff1ff98e56"} Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.101524 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.102248 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.102817 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.307455 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-w2fhp" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.308321 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-w2fhp" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.434221 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-w2fhp" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.435717 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.436550 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.437224 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 
15:51:27.437586 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.572725 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.573847 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.574211 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.574698 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.574916 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.725912 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b10c0e8-a755-4134-a03c-4c84d3e05238-kube-api-access\") pod \"3b10c0e8-a755-4134-a03c-4c84d3e05238\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.726422 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-var-lock\") pod \"3b10c0e8-a755-4134-a03c-4c84d3e05238\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.726468 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-kubelet-dir\") pod \"3b10c0e8-a755-4134-a03c-4c84d3e05238\" (UID: \"3b10c0e8-a755-4134-a03c-4c84d3e05238\") " Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.726570 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-var-lock" (OuterVolumeSpecName: "var-lock") pod "3b10c0e8-a755-4134-a03c-4c84d3e05238" (UID: "3b10c0e8-a755-4134-a03c-4c84d3e05238"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.726678 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b10c0e8-a755-4134-a03c-4c84d3e05238" (UID: "3b10c0e8-a755-4134-a03c-4c84d3e05238"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.727075 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-var-lock\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.727094 4713 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b10c0e8-a755-4134-a03c-4c84d3e05238-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.733884 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b10c0e8-a755-4134-a03c-4c84d3e05238-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b10c0e8-a755-4134-a03c-4c84d3e05238" (UID: "3b10c0e8-a755-4134-a03c-4c84d3e05238"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:27 crc kubenswrapper[4713]: I0127 15:51:27.827872 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b10c0e8-a755-4134-a03c-4c84d3e05238-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.067190 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.068008 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.068464 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.068813 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.069672 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.069957 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.070414 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": 
dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.107590 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.110931 4713 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038" exitCode=0 Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.111047 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.111064 4713 scope.go:117] "RemoveContainer" containerID="dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.115382 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.115461 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"3b10c0e8-a755-4134-a03c-4c84d3e05238","Type":"ContainerDied","Data":"a2d9483666d7fd5fd68c810817033eda831131995b0cf4826fb50a8096074140"} Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.115658 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a2d9483666d7fd5fd68c810817033eda831131995b0cf4826fb50a8096074140" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.132317 4713 scope.go:117] "RemoveContainer" containerID="2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.135379 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.135703 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.136329 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 
crc kubenswrapper[4713]: I0127 15:51:28.136958 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.137797 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.147797 4713 scope.go:117] "RemoveContainer" containerID="0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.164243 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-w2fhp" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.165093 4713 scope.go:117] "RemoveContainer" containerID="d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.165295 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.165990 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.166972 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.167666 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.168148 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.184453 4713 scope.go:117] "RemoveContainer" containerID="363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.208759 4713 scope.go:117] "RemoveContainer" containerID="e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.232220 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.232406 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.232446 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") "
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.233194 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.233246 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.233319 4713 scope.go:117] "RemoveContainer" containerID="dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.233331 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.234250 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\": container with ID starting with dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac not found: ID does not exist" containerID="dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.234750 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac"} err="failed to get container status \"dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\": rpc error: code = NotFound desc = could not find container \"dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac\": container with ID starting with dc6f7fcb1391e8fbb655209e9872094f3d94eba4fe1deda53d8ea5a30b7d81ac not found: ID does not exist"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.234866 4713 scope.go:117] "RemoveContainer" containerID="2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735"
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.235475 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\": container with ID starting with 2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735 not found: ID does not exist" containerID="2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.235534 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735"} err="failed to get container status \"2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\": rpc error: code = NotFound desc = could not find container \"2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735\": container with ID starting with 2268d287aa34bccf381665a33c90625484e853d1520ca832af95f8c6fd96b735 not found: ID does not exist"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.235597 4713 scope.go:117] "RemoveContainer" containerID="0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4"
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.235970 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\": container with ID starting with 0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4 not found: ID does not exist" containerID="0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.236108 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4"} err="failed to get container status \"0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\": rpc error: code = NotFound desc = could not find container \"0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4\": container with ID starting with 0de88a99268daefef735a41662ab9d2547bf3ff739316688c976cfed558c56b4 not found: ID does not exist"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.236241 4713 scope.go:117] "RemoveContainer" containerID="d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061"
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.236956 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\": container with ID starting with d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061 not found: ID does not exist" containerID="d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.237000 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061"} err="failed to get container status \"d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\": rpc error: code = NotFound desc = could not find container \"d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061\": container with ID starting with d76a8556df0cedc86009b8a431c56d9502026f25c9035adcb25a998beaee8061 not found: ID does not exist"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.237060 4713 scope.go:117] "RemoveContainer" containerID="363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038"
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.237321 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\": container with ID starting with 363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038 not found: ID does not exist" containerID="363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.237348 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038"} err="failed to get container status \"363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\": rpc error: code = NotFound desc = could not find container \"363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038\": container with ID starting with 363725e2971cb9ae7fad4b30777b8546ec03b1d7c3f252614e5ea5046397f038 not found: ID does not exist"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.237367 4713 scope.go:117] "RemoveContainer" containerID="e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb"
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.237651 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\": container with ID starting with e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb not found: ID does not exist" containerID="e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.237677 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb"} err="failed to get container status \"e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\": rpc error: code = NotFound desc = could not find container \"e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb\": container with ID starting with e257be4483de23bb8faff75c4731460c450a4c2510e2cea21926e7d3542e58fb not found: ID does not exist"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.334632 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.334683 4713 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.334699 4713 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.456072 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.456445 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.457226 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.458138 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.458437 4713 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:28 crc kubenswrapper[4713]: I0127 15:51:28.907131 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes"
Jan 27 15:51:28 crc kubenswrapper[4713]: E0127 15:51:28.981535 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-rwtsb.188ea153eeb70126 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-rwtsb,UID:c255f079-b4c8-4c29-8e77-28e56c9a9ecf,APIVersion:v1,ResourceVersion:29714,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 387ms (387ms including waiting). Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:51:25.46251191 +0000 UTC m=+253.240721868,LastTimestamp:2026-01-27 15:51:25.46251191 +0000 UTC m=+253.240721868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.661889 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fkchc"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.664175 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fkchc"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.872761 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jclch"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.872825 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jclch"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.921880 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jclch"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.922865 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.923687 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.924294 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.924931 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:29 crc kubenswrapper[4713]: I0127 15:51:29.925338 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.166851 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jclch"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.167585 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.167974 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.168743 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.169307 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.169645 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:30 crc kubenswrapper[4713]: I0127 15:51:30.713596 4713 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fkchc" podUID="dd505900-1549-404e-acfa-948789f8372d" containerName="registry-server" probeResult="failure" output=<
Jan 27 15:51:30 crc kubenswrapper[4713]: timeout: failed to connect service ":50051" within 1s
Jan 27 15:51:30 crc kubenswrapper[4713]: >
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.268507 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.270166 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.270799 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.271119 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.271596 4713 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.271679 4713 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.272233 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="200ms"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.302442 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rwtsb"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.302531 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rwtsb"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.347849 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rwtsb"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.348535 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.348787 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.350774 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.351518 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.351847 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.474181 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="400ms"
Jan 27 15:51:32 crc kubenswrapper[4713]: E0127 15:51:32.876108 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="800ms"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.904312 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.905167 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.905794 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.906270 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:32 crc kubenswrapper[4713]: I0127 15:51:32.906566 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.192819 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rwtsb"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.193513 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.193915 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.194442 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.194685 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.195133 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:33 crc kubenswrapper[4713]: E0127 15:51:33.677218 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="1.6s"
Jan 27 15:51:33 crc kubenswrapper[4713]: I0127 15:51:33.770288 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" containerName="oauth-openshift" containerID="cri-o://6265789ef3cd01c1421715ddc99bcde0b2439109a66dc5cdb6bb38dc73c49e5e" gracePeriod=15
Jan 27 15:51:35 crc kubenswrapper[4713]: E0127 15:51:35.278554 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="3.2s"
Jan 27 15:51:36 crc kubenswrapper[4713]: I0127 15:51:36.163031 4713 generic.go:334] "Generic (PLEG): container finished" podID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" containerID="6265789ef3cd01c1421715ddc99bcde0b2439109a66dc5cdb6bb38dc73c49e5e" exitCode=0
Jan 27 15:51:36 crc kubenswrapper[4713]: I0127 15:51:36.163076 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" event={"ID":"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d","Type":"ContainerDied","Data":"6265789ef3cd01c1421715ddc99bcde0b2439109a66dc5cdb6bb38dc73c49e5e"}
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.116515 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.117347 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.117894 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.118226 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.118493 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.118796 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.119135 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.184919 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" event={"ID":"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d","Type":"ContainerDied","Data":"0a8011bcb96d74949f2ef2251a509aebc51f1e9654c4ff714b21ddafd0c7fc57"}
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.184986 4713 scope.go:117] "RemoveContainer" containerID="6265789ef3cd01c1421715ddc99bcde0b2439109a66dc5cdb6bb38dc73c49e5e"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.185016 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.185890 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.186459 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.186631 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.186773 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.186972 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.187217 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused"
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279461 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-error\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279548 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grm2d\" (UniqueName: \"kubernetes.io/projected/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-kube-api-access-grm2d\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279579 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-provider-selection\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279628 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-idp-0-file-data\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279668 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-serving-cert\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279692 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-trusted-ca-bundle\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279722 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-ocp-branding-template\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279748 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-service-ca\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") "
Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279772 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName:
\"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-router-certs\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279879 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-session\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279930 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-dir\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.279965 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-policies\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280054 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280097 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-cliconfig\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280248 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-login\") pod \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\" (UID: \"4bb42033-3de8-4e48-a4e6-288b9fb3dc8d\") " Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280668 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280754 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280912 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280943 4713 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.280959 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.281602 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.281739 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.285490 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-kube-api-access-grm2d" (OuterVolumeSpecName: "kube-api-access-grm2d") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "kube-api-access-grm2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.287528 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.287529 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.297310 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.297698 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.297878 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.298094 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.299077 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.300720 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" (UID: "4bb42033-3de8-4e48-a4e6-288b9fb3dc8d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382722 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382882 4713 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382899 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382914 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382926 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grm2d\" (UniqueName: \"kubernetes.io/projected/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-kube-api-access-grm2d\") on node \"crc\" 
DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382936 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382947 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382955 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382964 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382973 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.382983 4713 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.498988 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.499338 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.499780 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.500370 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.500606 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:37 crc kubenswrapper[4713]: I0127 15:51:37.500852 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:38 crc kubenswrapper[4713]: E0127 15:51:38.480096 4713 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.107:6443: connect: connection refused" interval="6.4s" Jan 27 15:51:38 crc kubenswrapper[4713]: E0127 15:51:38.983721 4713 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.102.83.107:6443: connect: connection refused" event="&Event{ObjectMeta:{redhat-marketplace-rwtsb.188ea153eeb70126 openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-rwtsb,UID:c255f079-b4c8-4c29-8e77-28e56c9a9ecf,APIVersion:v1,ResourceVersion:29714,FieldPath:spec.containers{registry-server},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\" in 387ms (387ms including waiting). 
Image size: 907837715 bytes.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-27 15:51:25.46251191 +0000 UTC m=+253.240721868,LastTimestamp:2026-01-27 15:51:25.46251191 +0000 UTC m=+253.240721868,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.200348 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.200413 4713 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68" exitCode=1 Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.200457 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68"} Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.201077 4713 scope.go:117] "RemoveContainer" containerID="b1c4637336e4a36e874fa1f6ace957f8ec0c4354e5468bcfd383b6598e011c68" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.202148 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.202500 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.203405 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.204276 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.204851 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.205616 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.206115 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.707418 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.708192 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.708621 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.708860 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.709079 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": 
dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.709272 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.709489 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.709701 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.709938 4713 status_manager.go:851] "Failed to get status for pod" podUID="dd505900-1549-404e-acfa-948789f8372d" pod="openshift-marketplace/redhat-operators-fkchc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fkchc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.748262 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fkchc" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.748893 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.749332 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.749877 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.750194 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.750510 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.750777 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.751064 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.751280 4713 status_manager.go:851] "Failed to get status for pod" podUID="dd505900-1549-404e-acfa-948789f8372d" pod="openshift-marketplace/redhat-operators-fkchc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fkchc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.899310 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.900385 4713 status_manager.go:851] "Failed to get status for pod" podUID="dd505900-1549-404e-acfa-948789f8372d" pod="openshift-marketplace/redhat-operators-fkchc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fkchc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.901269 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.901571 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.901834 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.902075 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.902255 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.902434 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.902597 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.916549 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.916605 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:39 crc kubenswrapper[4713]: E0127 15:51:39.917186 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:39 crc kubenswrapper[4713]: I0127 15:51:39.917692 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:39 crc kubenswrapper[4713]: W0127 15:51:39.948249 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-129a855d808fa3aa9b98fb201a2d99d8f6006595b3912875f0b0d1c5911ab3de WatchSource:0}: Error finding container 129a855d808fa3aa9b98fb201a2d99d8f6006595b3912875f0b0d1c5911ab3de: Status 404 returned error can't find the container with id 129a855d808fa3aa9b98fb201a2d99d8f6006595b3912875f0b0d1c5911ab3de Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.207677 4713 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1343a05892c2120396fc74fb17280522e0f7bac97e9a8117ff79564b17b9498d" exitCode=0 Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.207813 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1343a05892c2120396fc74fb17280522e0f7bac97e9a8117ff79564b17b9498d"} Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.208212 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"129a855d808fa3aa9b98fb201a2d99d8f6006595b3912875f0b0d1c5911ab3de"} Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.208533 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.208563 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.209069 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: E0127 15:51:40.209158 4713 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.209416 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.209817 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.210595 4713 status_manager.go:851] "Failed to get status for pod" 
podUID="dd505900-1549-404e-acfa-948789f8372d" pod="openshift-marketplace/redhat-operators-fkchc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fkchc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.211251 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.211432 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.211673 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.211983 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.215403 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.216126 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ce6b8ddf2e8dd7d1c7ffc69ad22421d6308191a834d84e9710e1f6f169f5ce22"} Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.217139 4713 status_manager.go:851] "Failed to get status for pod" podUID="9a8987a5-59c3-478b-8b6a-1f2712f6a6f8" pod="openshift-marketplace/community-operators-jclch" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-jclch\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.218241 4713 status_manager.go:851] "Failed to get status for pod" podUID="ebb9c0b9-bd38-4496-990a-ab1d02cc792b" pod="openshift-marketplace/certified-operators-w2fhp" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-w2fhp\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.218641 4713 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.219215 4713 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.219476 4713 status_manager.go:851] "Failed to get status for pod" podUID="dd505900-1549-404e-acfa-948789f8372d" pod="openshift-marketplace/redhat-operators-fkchc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-fkchc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.219676 4713 status_manager.go:851] "Failed to get status for pod" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.219980 4713 status_manager.go:851] "Failed to get status for pod" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" pod="openshift-authentication/oauth-openshift-558db77b4-t92fx" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-t92fx\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:40 crc kubenswrapper[4713]: I0127 15:51:40.220345 4713 status_manager.go:851] "Failed to get status for pod" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" pod="openshift-marketplace/redhat-marketplace-rwtsb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-marketplace-rwtsb\": dial tcp 38.102.83.107:6443: connect: connection refused" Jan 27 15:51:41 crc kubenswrapper[4713]: I0127 15:51:41.229191 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2a86f0992b64647ec6ab3ca03aecd5fa9401f7ed996cbeb6e0f57ee1d350de1c"} Jan 27 15:51:41 crc kubenswrapper[4713]: I0127 15:51:41.229259 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"91fe143522c8580c2f1eb45cb926f2c9b63256f6bfdae7f4515697a11dfc5467"} Jan 27 15:51:41 crc kubenswrapper[4713]: I0127 15:51:41.229274 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"03e6accad8e006ea2665b0514e1c70989d9d2147127357a7bbd70a5f06fb4d13"} Jan 27 15:51:41 crc kubenswrapper[4713]: I0127 15:51:41.229291 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"8e36b61dfd758eb24bdaa1ec2cb02ee0ad4f221aba9ae2b57220050693bb1339"} Jan 27 15:51:42 crc kubenswrapper[4713]: I0127 15:51:42.239484 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"f36880d12f8e82b80b33fd8491edd1b0525ea67e1ff2734cef91d21382e67a51"} Jan 27 15:51:42 crc kubenswrapper[4713]: I0127 15:51:42.239895 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:42 crc kubenswrapper[4713]: I0127 15:51:42.239879 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:42 crc kubenswrapper[4713]: I0127 15:51:42.239923 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:42 crc kubenswrapper[4713]: I0127 15:51:42.437898 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:51:44 crc kubenswrapper[4713]: I0127 15:51:44.918877 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:44 crc kubenswrapper[4713]: I0127 15:51:44.920666 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:44 crc kubenswrapper[4713]: I0127 15:51:44.925904 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:45 crc kubenswrapper[4713]: I0127 15:51:45.890842 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:51:45 crc kubenswrapper[4713]: I0127 15:51:45.895340 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.003871 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" podUID="4994bfc4-7e92-4877-a981-6d94b4df000a" containerName="registry" containerID="cri-o://a22ace44e1f5d186c5f05236660b55dfa2b296cc07956272ae6d73c498806db6" gracePeriod=30 Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.257413 4713 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.271488 4713 generic.go:334] "Generic (PLEG): container finished" podID="4994bfc4-7e92-4877-a981-6d94b4df000a" 
containerID="a22ace44e1f5d186c5f05236660b55dfa2b296cc07956272ae6d73c498806db6" exitCode=0 Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.271549 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" event={"ID":"4994bfc4-7e92-4877-a981-6d94b4df000a","Type":"ContainerDied","Data":"a22ace44e1f5d186c5f05236660b55dfa2b296cc07956272ae6d73c498806db6"} Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.426123 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.525490 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fd5aa2d4-4da4-4f83-8ead-e5cee33b32b5" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.526555 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4994bfc4-7e92-4877-a981-6d94b4df000a-installation-pull-secrets\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.526884 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.526992 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4994bfc4-7e92-4877-a981-6d94b4df000a-ca-trust-extracted\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: 
\"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.527117 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-trusted-ca\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.527199 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-tls\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.527282 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-certificates\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.527377 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbct5\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-kube-api-access-hbct5\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.527461 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-bound-sa-token\") pod \"4994bfc4-7e92-4877-a981-6d94b4df000a\" (UID: \"4994bfc4-7e92-4877-a981-6d94b4df000a\") " Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.530226 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.530364 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.537718 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.538429 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.549820 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4994bfc4-7e92-4877-a981-6d94b4df000a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). 
InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.549854 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.561982 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-kube-api-access-hbct5" (OuterVolumeSpecName: "kube-api-access-hbct5") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "kube-api-access-hbct5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.565707 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4994bfc4-7e92-4877-a981-6d94b4df000a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4994bfc4-7e92-4877-a981-6d94b4df000a" (UID: "4994bfc4-7e92-4877-a981-6d94b4df000a"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629208 4713 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4994bfc4-7e92-4877-a981-6d94b4df000a-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629265 4713 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4994bfc4-7e92-4877-a981-6d94b4df000a-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629280 4713 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629292 4713 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629304 4713 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4994bfc4-7e92-4877-a981-6d94b4df000a-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629315 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbct5\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-kube-api-access-hbct5\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:47 crc kubenswrapper[4713]: I0127 15:51:47.629325 4713 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4994bfc4-7e92-4877-a981-6d94b4df000a-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 27 15:51:48 crc 
kubenswrapper[4713]: I0127 15:51:48.280438 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" event={"ID":"4994bfc4-7e92-4877-a981-6d94b4df000a","Type":"ContainerDied","Data":"daa8ff730b22b66d1845c5aa1735bbe07dc8f2902cb530c9c8311963cd33d80e"} Jan 27 15:51:48 crc kubenswrapper[4713]: I0127 15:51:48.280931 4713 scope.go:117] "RemoveContainer" containerID="a22ace44e1f5d186c5f05236660b55dfa2b296cc07956272ae6d73c498806db6" Jan 27 15:51:48 crc kubenswrapper[4713]: I0127 15:51:48.280480 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-jjr22" Jan 27 15:51:48 crc kubenswrapper[4713]: I0127 15:51:48.280844 4713 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:48 crc kubenswrapper[4713]: I0127 15:51:48.281106 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c8671861-b7cf-4af6-96cc-fcde117c229e" Jan 27 15:51:48 crc kubenswrapper[4713]: I0127 15:51:48.286342 4713 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="fd5aa2d4-4da4-4f83-8ead-e5cee33b32b5" Jan 27 15:51:52 crc kubenswrapper[4713]: I0127 15:51:52.442407 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 27 15:51:57 crc kubenswrapper[4713]: I0127 15:51:57.551945 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 27 15:51:57 crc kubenswrapper[4713]: I0127 15:51:57.699835 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 27 
15:51:57 crc kubenswrapper[4713]: I0127 15:51:57.842028 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 27 15:51:57 crc kubenswrapper[4713]: I0127 15:51:57.875777 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 27 15:51:57 crc kubenswrapper[4713]: I0127 15:51:57.982924 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 27 15:51:57 crc kubenswrapper[4713]: I0127 15:51:57.999514 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 27 15:51:58 crc kubenswrapper[4713]: I0127 15:51:58.297630 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 27 15:51:58 crc kubenswrapper[4713]: I0127 15:51:58.610134 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:51:58 crc kubenswrapper[4713]: I0127 15:51:58.764392 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 27 15:51:58 crc kubenswrapper[4713]: I0127 15:51:58.857110 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 27 15:51:58 crc kubenswrapper[4713]: I0127 15:51:58.958166 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.092120 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.106817 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.193740 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.222656 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.257709 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.297582 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.342258 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.347530 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.399164 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.500345 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.619926 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.867599 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 27 15:51:59 crc kubenswrapper[4713]: I0127 15:51:59.991825 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.240655 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.459297 4713 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.488787 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.566901 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.629056 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.781316 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.788726 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.795415 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 27 15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.803343 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 27 
15:52:00 crc kubenswrapper[4713]: I0127 15:52:00.968503 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.170269 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.228120 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.346342 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.480357 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.499351 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.540689 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.656248 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.673467 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.822799 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.882013 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.916736 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 27 15:52:01 crc kubenswrapper[4713]: I0127 15:52:01.928798 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.020551 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.143704 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.167121 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.223108 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.253766 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.257733 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.364221 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.398769 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.417854 4713 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.472976 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.488967 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.495938 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.557420 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.561985 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.723993 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.758705 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.850496 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 27 15:52:02 crc kubenswrapper[4713]: I0127 15:52:02.937262 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.069733 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.069733 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.121427 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.152407 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.153893 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.234221 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.267602 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.296348 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.402212 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.411113 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.440679 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.550360 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.556569 4713 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.637131 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.692375 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.729575 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.737638 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.796927 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.831852 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.915922 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.939318 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 27 15:52:03 crc kubenswrapper[4713]: I0127 15:52:03.976614 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.022551 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.036747 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.084857 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.114635 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.151827 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.204570 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.317505 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.444654 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.523892 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.548119 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.727461 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.917002 4713 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.934665 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 27 15:52:04 crc kubenswrapper[4713]: I0127 15:52:04.977479 4713 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.234550 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.273774 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.280740 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.291350 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.304951 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.323917 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.331967 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.423813 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-console"/"kube-root-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.550905 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.575482 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.587932 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.609494 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.610398 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.645722 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.718874 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.731002 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.734843 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.782025 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.830999 4713 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 27 15:52:05 crc kubenswrapper[4713]: I0127 15:52:05.989809 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.053563 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.075771 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.232925 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.267376 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.334006 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.490263 4713 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.504947 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.534027 4713 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.594669 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 
15:52:06.615561 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.615746 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.707890 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.762185 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.791453 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.799330 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.853888 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.855086 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.974242 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 27 15:52:06 crc kubenswrapper[4713]: I0127 15:52:06.977346 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.034112 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 
27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.171679 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.270860 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.305391 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.354797 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.364574 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.388375 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.458204 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.492818 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.507167 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.510860 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.565195 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.587298 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.606340 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.610761 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.613000 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.754925 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.758566 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.776234 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.804104 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.849562 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 27 15:52:07 crc kubenswrapper[4713]: I0127 15:52:07.910874 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.017315 4713 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.204869 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.248255 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.299857 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.328392 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.335485 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.338102 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.338694 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.362109 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.543200 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.560540 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 
15:52:08.575365 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.672352 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.699121 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.747442 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.879099 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 27 15:52:08 crc kubenswrapper[4713]: I0127 15:52:08.990675 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.081106 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.128196 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.270758 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.272608 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.304619 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 
27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.307524 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.410884 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.442901 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.531394 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.595390 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.698115 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.737088 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.746918 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.841925 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.859225 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.941560 4713 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-canary"/"canary-serving-cert" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.950064 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.994342 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 27 15:52:09 crc kubenswrapper[4713]: I0127 15:52:09.994653 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.126433 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.139283 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.168783 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.369466 4713 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.370229 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rwtsb" podStartSLOduration=46.931564422 podStartE2EDuration="49.370211546s" podCreationTimestamp="2026-01-27 15:51:21 +0000 UTC" firstStartedPulling="2026-01-27 15:51:23.023648699 +0000 UTC m=+250.801858637" lastFinishedPulling="2026-01-27 15:51:25.462295823 +0000 UTC m=+253.240505761" observedRunningTime="2026-01-27 15:51:47.464594842 +0000 UTC m=+275.242804800" watchObservedRunningTime="2026-01-27 15:52:10.370211546 +0000 UTC m=+298.148421484" Jan 27 15:52:10 crc 
kubenswrapper[4713]: I0127 15:52:10.372157 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.374275 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=45.374254724 podStartE2EDuration="45.374254724s" podCreationTimestamp="2026-01-27 15:51:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:51:47.332621018 +0000 UTC m=+275.110830966" watchObservedRunningTime="2026-01-27 15:52:10.374254724 +0000 UTC m=+298.152464662" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.374785 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-t92fx","openshift-image-registry/image-registry-697d97f7c8-jjr22","openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.374855 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.379446 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.394396 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=23.39436987 podStartE2EDuration="23.39436987s" podCreationTimestamp="2026-01-27 15:51:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:10.391875217 +0000 UTC m=+298.170085175" watchObservedRunningTime="2026-01-27 15:52:10.39436987 +0000 UTC m=+298.172579808" Jan 27 15:52:10 crc 
kubenswrapper[4713]: I0127 15:52:10.412889 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.425497 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.539499 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.676654 4713 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.775362 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.907226 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4994bfc4-7e92-4877-a981-6d94b4df000a" path="/var/lib/kubelet/pods/4994bfc4-7e92-4877-a981-6d94b4df000a/volumes" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.908223 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" path="/var/lib/kubelet/pods/4bb42033-3de8-4e48-a4e6-288b9fb3dc8d/volumes" Jan 27 15:52:10 crc kubenswrapper[4713]: I0127 15:52:10.977444 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.007978 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.023887 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 
15:52:11.092759 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.166840 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.187852 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.238319 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.351670 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.425529 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.509328 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.628707 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.738111 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.878210 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 27 15:52:11 crc kubenswrapper[4713]: I0127 15:52:11.957981 4713 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-machine-api"/"machine-api-operator-images" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.002451 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.024241 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.041356 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.087670 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.188997 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.203510 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.324354 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.406595 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.495522 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.496323 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 27 
15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.504434 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.703445 4713 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.801871 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 15:52:12 crc kubenswrapper[4713]: I0127 15:52:12.924987 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 27 15:52:13 crc kubenswrapper[4713]: I0127 15:52:13.043921 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 27 15:52:13 crc kubenswrapper[4713]: I0127 15:52:13.730848 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.230590 4713 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.231954 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d8a49a95a57ace26bb9f5f2b76d3662c8a88064a53cdf39542dc93ff1ff98e56" gracePeriod=5 Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.274913 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"] Jan 27 15:52:21 crc kubenswrapper[4713]: E0127 15:52:21.275231 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275246 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 15:52:21 crc kubenswrapper[4713]: E0127 15:52:21.275263 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" containerName="installer" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275274 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" containerName="installer" Jan 27 15:52:21 crc kubenswrapper[4713]: E0127 15:52:21.275298 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4994bfc4-7e92-4877-a981-6d94b4df000a" containerName="registry" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275306 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4994bfc4-7e92-4877-a981-6d94b4df000a" containerName="registry" Jan 27 15:52:21 crc kubenswrapper[4713]: E0127 15:52:21.275320 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" containerName="oauth-openshift" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275328 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" containerName="oauth-openshift" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275468 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b10c0e8-a755-4134-a03c-4c84d3e05238" containerName="installer" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275484 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275494 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4bb42033-3de8-4e48-a4e6-288b9fb3dc8d" 
containerName="oauth-openshift" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.275509 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="4994bfc4-7e92-4877-a981-6d94b4df000a" containerName="registry" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.276165 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.284599 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.284675 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.284718 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.284857 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.284631 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.285661 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.285887 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.286116 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 
15:52:21.286240 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.286953 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.287289 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.291298 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.294816 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"] Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.296364 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.301371 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.301762 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400167 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-session\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400275 4713 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400354 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-error\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400430 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400487 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400520 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400608 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-login\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400673 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400769 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400806 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400833 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-audit-policies\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400857 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ldl7\" (UniqueName: \"kubernetes.io/projected/e5d8317c-4b34-4e90-9c06-f8266e3d8820-kube-api-access-4ldl7\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400882 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5d8317c-4b34-4e90-9c06-f8266e3d8820-audit-dir\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.400909 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505718 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505768 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-audit-policies\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505790 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ldl7\" (UniqueName: \"kubernetes.io/projected/e5d8317c-4b34-4e90-9c06-f8266e3d8820-kube-api-access-4ldl7\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505814 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5d8317c-4b34-4e90-9c06-f8266e3d8820-audit-dir\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505839 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: 
\"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505889 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-session\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505908 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505941 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-error\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.505984 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.506009 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.506097 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.506140 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-login\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.506163 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.506211 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-cliconfig\") pod 
\"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.507502 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-service-ca\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.507849 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.507915 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e5d8317c-4b34-4e90-9c06-f8266e3d8820-audit-dir\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.507969 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-audit-policies\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.508382 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" 
(UniqueName: \"kubernetes.io/configmap/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.513798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.513832 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-session\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.514340 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.514371 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-router-certs\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " 
pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.522306 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-error\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.523717 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.524114 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ldl7\" (UniqueName: \"kubernetes.io/projected/e5d8317c-4b34-4e90-9c06-f8266e3d8820-kube-api-access-4ldl7\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.529980 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.533961 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e5d8317c-4b34-4e90-9c06-f8266e3d8820-v4-0-config-user-template-login\") pod \"oauth-openshift-7d9c768c99-8ds4c\" (UID: \"e5d8317c-4b34-4e90-9c06-f8266e3d8820\") " pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.600585 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:21 crc kubenswrapper[4713]: I0127 15:52:21.805324 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"]
Jan 27 15:52:22 crc kubenswrapper[4713]: I0127 15:52:22.520665 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" event={"ID":"e5d8317c-4b34-4e90-9c06-f8266e3d8820","Type":"ContainerStarted","Data":"14c870c324381cd1586511aa4ae6f04c934e283d1738510d2648d2f2d260d259"}
Jan 27 15:52:22 crc kubenswrapper[4713]: I0127 15:52:22.521481 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" event={"ID":"e5d8317c-4b34-4e90-9c06-f8266e3d8820","Type":"ContainerStarted","Data":"f7ff8daf2f1845c4ea71fba38d46e32cc99668438cbcecb0dc62a3ba32bf9a30"}
Jan 27 15:52:22 crc kubenswrapper[4713]: I0127 15:52:22.524804 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:22 crc kubenswrapper[4713]: I0127 15:52:22.556489 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c" podStartSLOduration=74.556464253 podStartE2EDuration="1m14.556464253s" podCreationTimestamp="2026-01-27 15:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:22.546140912 +0000 UTC m=+310.324350850" watchObservedRunningTime="2026-01-27 15:52:22.556464253 +0000 UTC m=+310.334674191"
Jan 27 15:52:22 crc kubenswrapper[4713]: I0127 15:52:22.676111 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7d9c768c99-8ds4c"
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.554981 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.555893 4713 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d8a49a95a57ace26bb9f5f2b76d3662c8a88064a53cdf39542dc93ff1ff98e56" exitCode=137
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.555957 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="00188965c2ae56d8de5eeff27d6250a9c840b5bbdefb424112766ef8fd87db8b"
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.574159 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log"
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.574279 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616538 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616665 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616712 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616737 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616724 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616805 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616814 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.616886 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") "
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.617109 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.617282 4713 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.617296 4713 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.617306 4713 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.617315 4713 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.628306 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 27 15:52:27 crc kubenswrapper[4713]: I0127 15:52:27.719184 4713 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.561911 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.907398 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes"
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.907715 4713 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID=""
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.920346 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.920409 4713 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8db66e29-3d30-4b45-beb6-3f07631878ee"
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.923908 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
Jan 27 15:52:28 crc kubenswrapper[4713]: I0127 15:52:28.923976 4713 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="8db66e29-3d30-4b45-beb6-3f07631878ee"
Jan 27 15:52:32 crc kubenswrapper[4713]: I0127 15:52:32.501625 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.051525 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7vcrd"]
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.054335 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerName="controller-manager" containerID="cri-o://71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867" gracePeriod=30
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.152558 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"]
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.153049 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" podUID="aae489c7-6a71-4465-9c25-7d27eb68b318" containerName="route-controller-manager" containerID="cri-o://17f0f4749132245e2116a10e5b1ebf4ab41e0dd5a2f1bdc21fb1919b0738f03b" gracePeriod=30
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.464069 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.572849 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-client-ca\") pod \"ad390cc5-3b01-4343-97a2-2c4385fe5142\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") "
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573071 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-proxy-ca-bundles\") pod \"ad390cc5-3b01-4343-97a2-2c4385fe5142\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") "
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573115 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-config\") pod \"ad390cc5-3b01-4343-97a2-2c4385fe5142\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") "
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573180 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad390cc5-3b01-4343-97a2-2c4385fe5142-serving-cert\") pod \"ad390cc5-3b01-4343-97a2-2c4385fe5142\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") "
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573213 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rtzfz\" (UniqueName: \"kubernetes.io/projected/ad390cc5-3b01-4343-97a2-2c4385fe5142-kube-api-access-rtzfz\") pod \"ad390cc5-3b01-4343-97a2-2c4385fe5142\" (UID: \"ad390cc5-3b01-4343-97a2-2c4385fe5142\") "
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573868 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-client-ca" (OuterVolumeSpecName: "client-ca") pod "ad390cc5-3b01-4343-97a2-2c4385fe5142" (UID: "ad390cc5-3b01-4343-97a2-2c4385fe5142"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573971 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "ad390cc5-3b01-4343-97a2-2c4385fe5142" (UID: "ad390cc5-3b01-4343-97a2-2c4385fe5142"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.573991 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-config" (OuterVolumeSpecName: "config") pod "ad390cc5-3b01-4343-97a2-2c4385fe5142" (UID: "ad390cc5-3b01-4343-97a2-2c4385fe5142"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.582818 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad390cc5-3b01-4343-97a2-2c4385fe5142-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ad390cc5-3b01-4343-97a2-2c4385fe5142" (UID: "ad390cc5-3b01-4343-97a2-2c4385fe5142"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.600474 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad390cc5-3b01-4343-97a2-2c4385fe5142-kube-api-access-rtzfz" (OuterVolumeSpecName: "kube-api-access-rtzfz") pod "ad390cc5-3b01-4343-97a2-2c4385fe5142" (UID: "ad390cc5-3b01-4343-97a2-2c4385fe5142"). InnerVolumeSpecName "kube-api-access-rtzfz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.619806 4713 generic.go:334] "Generic (PLEG): container finished" podID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerID="71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867" exitCode=0
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.620199 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" event={"ID":"ad390cc5-3b01-4343-97a2-2c4385fe5142","Type":"ContainerDied","Data":"71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867"}
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.620493 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd" event={"ID":"ad390cc5-3b01-4343-97a2-2c4385fe5142","Type":"ContainerDied","Data":"462ac6c603c122c595919aa6ef1f9e487ac5c51c3f30a82456d2abed7f61d0e2"}
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.620384 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-7vcrd"
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.620782 4713 scope.go:117] "RemoveContainer" containerID="71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867"
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.623048 4713 generic.go:334] "Generic (PLEG): container finished" podID="aae489c7-6a71-4465-9c25-7d27eb68b318" containerID="17f0f4749132245e2116a10e5b1ebf4ab41e0dd5a2f1bdc21fb1919b0738f03b" exitCode=0
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.623150 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" event={"ID":"aae489c7-6a71-4465-9c25-7d27eb68b318","Type":"ContainerDied","Data":"17f0f4749132245e2116a10e5b1ebf4ab41e0dd5a2f1bdc21fb1919b0738f03b"}
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.644222 4713 scope.go:117] "RemoveContainer" containerID="71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867"
Jan 27 15:52:38 crc kubenswrapper[4713]: E0127 15:52:38.644988 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867\": container with ID starting with 71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867 not found: ID does not exist" containerID="71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867"
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.645105 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867"} err="failed to get container status \"71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867\": rpc error: code = NotFound desc = could not find container \"71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867\": container with ID starting with 71c625933859e92258d9c426cd20976b48bf766ae208cea0739acb315cd88867 not found: ID does not exist"
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.651316 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7vcrd"]
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.654984 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-7vcrd"]
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.674541 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ad390cc5-3b01-4343-97a2-2c4385fe5142-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.674586 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rtzfz\" (UniqueName: \"kubernetes.io/projected/ad390cc5-3b01-4343-97a2-2c4385fe5142-kube-api-access-rtzfz\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.674598 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-client-ca\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.674672 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.674689 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ad390cc5-3b01-4343-97a2-2c4385fe5142-config\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:38 crc kubenswrapper[4713]: I0127 15:52:38.909332 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" path="/var/lib/kubelet/pods/ad390cc5-3b01-4343-97a2-2c4385fe5142/volumes"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.265440 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.292919 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-78bf788545-5gzcp"]
Jan 27 15:52:39 crc kubenswrapper[4713]: E0127 15:52:39.293298 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerName="controller-manager"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.293315 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerName="controller-manager"
Jan 27 15:52:39 crc kubenswrapper[4713]: E0127 15:52:39.293346 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae489c7-6a71-4465-9c25-7d27eb68b318" containerName="route-controller-manager"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.293352 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae489c7-6a71-4465-9c25-7d27eb68b318" containerName="route-controller-manager"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.293464 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae489c7-6a71-4465-9c25-7d27eb68b318" containerName="route-controller-manager"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.293477 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad390cc5-3b01-4343-97a2-2c4385fe5142" containerName="controller-manager"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.294026 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.297344 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.297507 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.298304 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.298545 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.298683 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.300124 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.307394 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78bf788545-5gzcp"]
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.317490 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.385926 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6r4t\" (UniqueName: \"kubernetes.io/projected/aae489c7-6a71-4465-9c25-7d27eb68b318-kube-api-access-k6r4t\") pod \"aae489c7-6a71-4465-9c25-7d27eb68b318\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") "
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.386089 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-client-ca\") pod \"aae489c7-6a71-4465-9c25-7d27eb68b318\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") "
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.386222 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-config\") pod \"aae489c7-6a71-4465-9c25-7d27eb68b318\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") "
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.386301 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae489c7-6a71-4465-9c25-7d27eb68b318-serving-cert\") pod \"aae489c7-6a71-4465-9c25-7d27eb68b318\" (UID: \"aae489c7-6a71-4465-9c25-7d27eb68b318\") "
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.386589 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq7pg\" (UniqueName: \"kubernetes.io/projected/99e40583-3b90-4857-93cf-422ee00b717d-kube-api-access-cq7pg\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.386638 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e40583-3b90-4857-93cf-422ee00b717d-serving-cert\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.386689 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-client-ca\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.387577 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-client-ca" (OuterVolumeSpecName: "client-ca") pod "aae489c7-6a71-4465-9c25-7d27eb68b318" (UID: "aae489c7-6a71-4465-9c25-7d27eb68b318"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.387627 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-config" (OuterVolumeSpecName: "config") pod "aae489c7-6a71-4465-9c25-7d27eb68b318" (UID: "aae489c7-6a71-4465-9c25-7d27eb68b318"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.387792 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-config\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.387835 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-proxy-ca-bundles\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.387899 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-client-ca\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.387917 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae489c7-6a71-4465-9c25-7d27eb68b318-config\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.393787 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aae489c7-6a71-4465-9c25-7d27eb68b318-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "aae489c7-6a71-4465-9c25-7d27eb68b318" (UID: "aae489c7-6a71-4465-9c25-7d27eb68b318"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.396797 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae489c7-6a71-4465-9c25-7d27eb68b318-kube-api-access-k6r4t" (OuterVolumeSpecName: "kube-api-access-k6r4t") pod "aae489c7-6a71-4465-9c25-7d27eb68b318" (UID: "aae489c7-6a71-4465-9c25-7d27eb68b318"). InnerVolumeSpecName "kube-api-access-k6r4t". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489423 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-config\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489477 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-proxy-ca-bundles\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489520 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cq7pg\" (UniqueName: \"kubernetes.io/projected/99e40583-3b90-4857-93cf-422ee00b717d-kube-api-access-cq7pg\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489562 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e40583-3b90-4857-93cf-422ee00b717d-serving-cert\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489603 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-client-ca\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489661 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/aae489c7-6a71-4465-9c25-7d27eb68b318-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.489676 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6r4t\" (UniqueName: \"kubernetes.io/projected/aae489c7-6a71-4465-9c25-7d27eb68b318-kube-api-access-k6r4t\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.490665 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-client-ca\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.491125 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-config\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.492011 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-proxy-ca-bundles\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.494379 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e40583-3b90-4857-93cf-422ee00b717d-serving-cert\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.508831 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq7pg\" (UniqueName: \"kubernetes.io/projected/99e40583-3b90-4857-93cf-422ee00b717d-kube-api-access-cq7pg\") pod \"controller-manager-78bf788545-5gzcp\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.620360 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.637961 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm" event={"ID":"aae489c7-6a71-4465-9c25-7d27eb68b318","Type":"ContainerDied","Data":"804ba3da8821a4db1b9097d1e82fde3602625b707cb58dfdc7606edfd52bd3ad"}
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.638397 4713 scope.go:117] "RemoveContainer" containerID="17f0f4749132245e2116a10e5b1ebf4ab41e0dd5a2f1bdc21fb1919b0738f03b"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.638349 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.676876 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"]
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.680964 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-h5jwm"]
Jan 27 15:52:39 crc kubenswrapper[4713]: I0127 15:52:39.836516 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-78bf788545-5gzcp"]
Jan 27 15:52:40 crc kubenswrapper[4713]: I0127 15:52:40.652860 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" event={"ID":"99e40583-3b90-4857-93cf-422ee00b717d","Type":"ContainerStarted","Data":"1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03"}
Jan 27 15:52:40 crc kubenswrapper[4713]: I0127 15:52:40.653283 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" event={"ID":"99e40583-3b90-4857-93cf-422ee00b717d","Type":"ContainerStarted","Data":"f498cdf74955784d5d9a51323be6ae2b1b6642486910bd7792e6e86b2862c0c4"}
Jan 27 15:52:40 crc kubenswrapper[4713]: I0127 15:52:40.653309 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:40 crc kubenswrapper[4713]: I0127 15:52:40.659411 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp"
Jan 27 15:52:40 crc kubenswrapper[4713]: I0127 15:52:40.674390 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" podStartSLOduration=2.674364577 podStartE2EDuration="2.674364577s" podCreationTimestamp="2026-01-27 15:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:40.670618927 +0000 UTC m=+328.448828855" watchObservedRunningTime="2026-01-27 15:52:40.674364577 +0000 UTC m=+328.452574515"
Jan 27 15:52:40 crc kubenswrapper[4713]: I0127 15:52:40.907642 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae489c7-6a71-4465-9c25-7d27eb68b318" path="/var/lib/kubelet/pods/aae489c7-6a71-4465-9c25-7d27eb68b318/volumes"
Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.292484 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x"]
Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.293514 4713 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.296556 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.297020 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.297028 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.297282 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.297508 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.297573 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.308899 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x"] Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.419946 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-config\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.420031 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b838a898-6391-4ed3-af8f-49ecba8d76dd-serving-cert\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.420131 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-client-ca\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.420156 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbf5v\" (UniqueName: \"kubernetes.io/projected/b838a898-6391-4ed3-af8f-49ecba8d76dd-kube-api-access-hbf5v\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.521699 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-config\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.521776 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b838a898-6391-4ed3-af8f-49ecba8d76dd-serving-cert\") pod 
\"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.521844 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbf5v\" (UniqueName: \"kubernetes.io/projected/b838a898-6391-4ed3-af8f-49ecba8d76dd-kube-api-access-hbf5v\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.521869 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-client-ca\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.522893 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-client-ca\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.524564 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-config\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.537920 4713 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.543602 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b838a898-6391-4ed3-af8f-49ecba8d76dd-serving-cert\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.549186 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hbf5v\" (UniqueName: \"kubernetes.io/projected/b838a898-6391-4ed3-af8f-49ecba8d76dd-kube-api-access-hbf5v\") pod \"route-controller-manager-8679d84dcf-zr76x\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.614655 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:41 crc kubenswrapper[4713]: I0127 15:52:41.884836 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x"] Jan 27 15:52:42 crc kubenswrapper[4713]: I0127 15:52:42.666674 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" event={"ID":"b838a898-6391-4ed3-af8f-49ecba8d76dd","Type":"ContainerStarted","Data":"09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50"} Jan 27 15:52:42 crc kubenswrapper[4713]: I0127 15:52:42.667153 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" event={"ID":"b838a898-6391-4ed3-af8f-49ecba8d76dd","Type":"ContainerStarted","Data":"f2a0083aa6c0c23dab99b1e99d2844b8c47e42c56ec3f292b996b25b70029430"} Jan 27 15:52:42 crc kubenswrapper[4713]: I0127 15:52:42.687170 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" podStartSLOduration=4.687147145 podStartE2EDuration="4.687147145s" podCreationTimestamp="2026-01-27 15:52:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:42.685199598 +0000 UTC m=+330.463409546" watchObservedRunningTime="2026-01-27 15:52:42.687147145 +0000 UTC m=+330.465357093" Jan 27 15:52:43 crc kubenswrapper[4713]: I0127 15:52:43.672089 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:43 crc kubenswrapper[4713]: I0127 15:52:43.678354 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.059592 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78bf788545-5gzcp"] Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.061020 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" podUID="99e40583-3b90-4857-93cf-422ee00b717d" containerName="controller-manager" containerID="cri-o://1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03" gracePeriod=30 Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.067802 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x"] Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.068057 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" podUID="b838a898-6391-4ed3-af8f-49ecba8d76dd" containerName="route-controller-manager" containerID="cri-o://09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50" gracePeriod=30 Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.617525 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.689590 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.713847 4713 generic.go:334] "Generic (PLEG): container finished" podID="b838a898-6391-4ed3-af8f-49ecba8d76dd" containerID="09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50" exitCode=0 Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.713926 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" event={"ID":"b838a898-6391-4ed3-af8f-49ecba8d76dd","Type":"ContainerDied","Data":"09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50"} Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.713965 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" event={"ID":"b838a898-6391-4ed3-af8f-49ecba8d76dd","Type":"ContainerDied","Data":"f2a0083aa6c0c23dab99b1e99d2844b8c47e42c56ec3f292b996b25b70029430"} Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.713986 4713 scope.go:117] "RemoveContainer" containerID="09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.714131 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.717448 4713 generic.go:334] "Generic (PLEG): container finished" podID="99e40583-3b90-4857-93cf-422ee00b717d" containerID="1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03" exitCode=0 Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.717511 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.717525 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" event={"ID":"99e40583-3b90-4857-93cf-422ee00b717d","Type":"ContainerDied","Data":"1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03"} Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.717620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-78bf788545-5gzcp" event={"ID":"99e40583-3b90-4857-93cf-422ee00b717d","Type":"ContainerDied","Data":"f498cdf74955784d5d9a51323be6ae2b1b6642486910bd7792e6e86b2862c0c4"} Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.725798 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbf5v\" (UniqueName: \"kubernetes.io/projected/b838a898-6391-4ed3-af8f-49ecba8d76dd-kube-api-access-hbf5v\") pod \"b838a898-6391-4ed3-af8f-49ecba8d76dd\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.725884 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b838a898-6391-4ed3-af8f-49ecba8d76dd-serving-cert\") pod \"b838a898-6391-4ed3-af8f-49ecba8d76dd\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.725931 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-config\") pod \"b838a898-6391-4ed3-af8f-49ecba8d76dd\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.725991 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-client-ca\") pod \"b838a898-6391-4ed3-af8f-49ecba8d76dd\" (UID: \"b838a898-6391-4ed3-af8f-49ecba8d76dd\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.727420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-client-ca" (OuterVolumeSpecName: "client-ca") pod "b838a898-6391-4ed3-af8f-49ecba8d76dd" (UID: "b838a898-6391-4ed3-af8f-49ecba8d76dd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.731121 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-config" (OuterVolumeSpecName: "config") pod "b838a898-6391-4ed3-af8f-49ecba8d76dd" (UID: "b838a898-6391-4ed3-af8f-49ecba8d76dd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.736330 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b838a898-6391-4ed3-af8f-49ecba8d76dd-kube-api-access-hbf5v" (OuterVolumeSpecName: "kube-api-access-hbf5v") pod "b838a898-6391-4ed3-af8f-49ecba8d76dd" (UID: "b838a898-6391-4ed3-af8f-49ecba8d76dd"). InnerVolumeSpecName "kube-api-access-hbf5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.739786 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b838a898-6391-4ed3-af8f-49ecba8d76dd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b838a898-6391-4ed3-af8f-49ecba8d76dd" (UID: "b838a898-6391-4ed3-af8f-49ecba8d76dd"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.748260 4713 scope.go:117] "RemoveContainer" containerID="09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50" Jan 27 15:52:48 crc kubenswrapper[4713]: E0127 15:52:48.748911 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50\": container with ID starting with 09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50 not found: ID does not exist" containerID="09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.748955 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50"} err="failed to get container status \"09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50\": rpc error: code = NotFound desc = could not find container \"09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50\": container with ID starting with 09ff280fa7d1c0b5ada73c46cf906f9b002c08500e34368989ed5e8524037b50 not found: ID does not exist" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.748983 4713 scope.go:117] "RemoveContainer" containerID="1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.768732 4713 scope.go:117] "RemoveContainer" containerID="1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03" Jan 27 15:52:48 crc kubenswrapper[4713]: E0127 15:52:48.769418 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03\": container with ID starting with 
1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03 not found: ID does not exist" containerID="1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.769470 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03"} err="failed to get container status \"1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03\": rpc error: code = NotFound desc = could not find container \"1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03\": container with ID starting with 1cb193a27821182674ba002a48cab0411da7e98a611e09bea659e0c6f2957a03 not found: ID does not exist" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827365 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq7pg\" (UniqueName: \"kubernetes.io/projected/99e40583-3b90-4857-93cf-422ee00b717d-kube-api-access-cq7pg\") pod \"99e40583-3b90-4857-93cf-422ee00b717d\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827424 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-config\") pod \"99e40583-3b90-4857-93cf-422ee00b717d\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827468 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-proxy-ca-bundles\") pod \"99e40583-3b90-4857-93cf-422ee00b717d\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827495 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-client-ca\") pod \"99e40583-3b90-4857-93cf-422ee00b717d\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827532 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e40583-3b90-4857-93cf-422ee00b717d-serving-cert\") pod \"99e40583-3b90-4857-93cf-422ee00b717d\" (UID: \"99e40583-3b90-4857-93cf-422ee00b717d\") " Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827809 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbf5v\" (UniqueName: \"kubernetes.io/projected/b838a898-6391-4ed3-af8f-49ecba8d76dd-kube-api-access-hbf5v\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827835 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b838a898-6391-4ed3-af8f-49ecba8d76dd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827846 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.827858 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b838a898-6391-4ed3-af8f-49ecba8d76dd-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.828749 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-config" (OuterVolumeSpecName: "config") pod "99e40583-3b90-4857-93cf-422ee00b717d" (UID: "99e40583-3b90-4857-93cf-422ee00b717d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.828768 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "99e40583-3b90-4857-93cf-422ee00b717d" (UID: "99e40583-3b90-4857-93cf-422ee00b717d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.829152 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-client-ca" (OuterVolumeSpecName: "client-ca") pod "99e40583-3b90-4857-93cf-422ee00b717d" (UID: "99e40583-3b90-4857-93cf-422ee00b717d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.831884 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99e40583-3b90-4857-93cf-422ee00b717d-kube-api-access-cq7pg" (OuterVolumeSpecName: "kube-api-access-cq7pg") pod "99e40583-3b90-4857-93cf-422ee00b717d" (UID: "99e40583-3b90-4857-93cf-422ee00b717d"). InnerVolumeSpecName "kube-api-access-cq7pg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.832299 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99e40583-3b90-4857-93cf-422ee00b717d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "99e40583-3b90-4857-93cf-422ee00b717d" (UID: "99e40583-3b90-4857-93cf-422ee00b717d"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.928705 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.928738 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cq7pg\" (UniqueName: \"kubernetes.io/projected/99e40583-3b90-4857-93cf-422ee00b717d-kube-api-access-cq7pg\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.928752 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.928760 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/99e40583-3b90-4857-93cf-422ee00b717d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:48 crc kubenswrapper[4713]: I0127 15:52:48.928769 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/99e40583-3b90-4857-93cf-422ee00b717d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.055386 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x"] Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.065113 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8679d84dcf-zr76x"] Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.075894 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-78bf788545-5gzcp"] Jan 27 15:52:49 crc 
kubenswrapper[4713]: I0127 15:52:49.079206 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-78bf788545-5gzcp"]
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.301230 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"]
Jan 27 15:52:49 crc kubenswrapper[4713]: E0127 15:52:49.306811 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99e40583-3b90-4857-93cf-422ee00b717d" containerName="controller-manager"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.306840 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="99e40583-3b90-4857-93cf-422ee00b717d" containerName="controller-manager"
Jan 27 15:52:49 crc kubenswrapper[4713]: E0127 15:52:49.306855 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b838a898-6391-4ed3-af8f-49ecba8d76dd" containerName="route-controller-manager"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.306864 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="b838a898-6391-4ed3-af8f-49ecba8d76dd" containerName="route-controller-manager"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.307234 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="b838a898-6391-4ed3-af8f-49ecba8d76dd" containerName="route-controller-manager"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.307256 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="99e40583-3b90-4857-93cf-422ee00b717d" containerName="controller-manager"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.307792 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.313510 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"]
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.314082 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.315708 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"]
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.315941 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.316234 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.316610 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.316904 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.318895 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"]
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.361888 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.362275 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.362519 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.362966 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.364296 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.364660 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.365295 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.366701 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.369901 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.462717 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c8b0c48-1ce4-468a-800f-533509c856fa-client-ca\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.462817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8b0c48-1ce4-468a-800f-533509c856fa-config\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.462850 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-config\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.462910 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f60468-7829-429c-b61f-fae649eb5824-serving-cert\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.462964 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-client-ca\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.462993 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-proxy-ca-bundles\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.463014 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c8b0c48-1ce4-468a-800f-533509c856fa-serving-cert\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.463135 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2g7qb\" (UniqueName: \"kubernetes.io/projected/70f60468-7829-429c-b61f-fae649eb5824-kube-api-access-2g7qb\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.463164 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mr46q\" (UniqueName: \"kubernetes.io/projected/9c8b0c48-1ce4-468a-800f-533509c856fa-kube-api-access-mr46q\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.564026 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2g7qb\" (UniqueName: \"kubernetes.io/projected/70f60468-7829-429c-b61f-fae649eb5824-kube-api-access-2g7qb\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.564486 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mr46q\" (UniqueName: \"kubernetes.io/projected/9c8b0c48-1ce4-468a-800f-533509c856fa-kube-api-access-mr46q\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.564874 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c8b0c48-1ce4-468a-800f-533509c856fa-client-ca\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.565082 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8b0c48-1ce4-468a-800f-533509c856fa-config\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.565245 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-config\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.565385 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f60468-7829-429c-b61f-fae649eb5824-serving-cert\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.565484 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-client-ca\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.566496 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-proxy-ca-bundles\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.566627 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c8b0c48-1ce4-468a-800f-533509c856fa-serving-cert\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.566918 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9c8b0c48-1ce4-468a-800f-533509c856fa-client-ca\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.566947 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-client-ca\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.567329 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-config\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.568949 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9c8b0c48-1ce4-468a-800f-533509c856fa-config\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.569080 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-proxy-ca-bundles\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.571280 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f60468-7829-429c-b61f-fae649eb5824-serving-cert\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.571865 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9c8b0c48-1ce4-468a-800f-533509c856fa-serving-cert\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.581448 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2g7qb\" (UniqueName: \"kubernetes.io/projected/70f60468-7829-429c-b61f-fae649eb5824-kube-api-access-2g7qb\") pod \"controller-manager-664c7c7bbc-ww29x\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.584916 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mr46q\" (UniqueName: \"kubernetes.io/projected/9c8b0c48-1ce4-468a-800f-533509c856fa-kube-api-access-mr46q\") pod \"route-controller-manager-7fbfdbf77-k826w\" (UID: \"9c8b0c48-1ce4-468a-800f-533509c856fa\") " pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.684793 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:49 crc kubenswrapper[4713]: I0127 15:52:49.695463 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.106959 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"]
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.154445 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"]
Jan 27 15:52:50 crc kubenswrapper[4713]: W0127 15:52:50.173566 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c8b0c48_1ce4_468a_800f_533509c856fa.slice/crio-5e20fbb9dc135ba5d9f65ffef0824f082daaf47e7414f1f36f3fb68204ba96be WatchSource:0}: Error finding container 5e20fbb9dc135ba5d9f65ffef0824f082daaf47e7414f1f36f3fb68204ba96be: Status 404 returned error can't find the container with id 5e20fbb9dc135ba5d9f65ffef0824f082daaf47e7414f1f36f3fb68204ba96be
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.737715 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x" event={"ID":"70f60468-7829-429c-b61f-fae649eb5824","Type":"ContainerStarted","Data":"d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71"}
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.738212 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x" event={"ID":"70f60468-7829-429c-b61f-fae649eb5824","Type":"ContainerStarted","Data":"026eea3ef5c16b989a9dd0596b2dfd76b05ee98ae1d5f3cb80f172a6828f2dcc"}
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.738951 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.742936 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w" event={"ID":"9c8b0c48-1ce4-468a-800f-533509c856fa","Type":"ContainerStarted","Data":"fc3b476357353b6702943f062bd042f1d7039e7f610873dc172651e2edf3367a"}
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.743011 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w" event={"ID":"9c8b0c48-1ce4-468a-800f-533509c856fa","Type":"ContainerStarted","Data":"5e20fbb9dc135ba5d9f65ffef0824f082daaf47e7414f1f36f3fb68204ba96be"}
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.744008 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.744149 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.760587 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x" podStartSLOduration=2.760551227 podStartE2EDuration="2.760551227s" podCreationTimestamp="2026-01-27 15:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:50.757473897 +0000 UTC m=+338.535683835" watchObservedRunningTime="2026-01-27 15:52:50.760551227 +0000 UTC m=+338.538761175"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.799256 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w" podStartSLOduration=2.799229151 podStartE2EDuration="2.799229151s" podCreationTimestamp="2026-01-27 15:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:50.793196005 +0000 UTC m=+338.571405943" watchObservedRunningTime="2026-01-27 15:52:50.799229151 +0000 UTC m=+338.577439089"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.894767 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fbfdbf77-k826w"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.898197 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"]
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.908887 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="99e40583-3b90-4857-93cf-422ee00b717d" path="/var/lib/kubelet/pods/99e40583-3b90-4857-93cf-422ee00b717d/volumes"
Jan 27 15:52:50 crc kubenswrapper[4713]: I0127 15:52:50.909487 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b838a898-6391-4ed3-af8f-49ecba8d76dd" path="/var/lib/kubelet/pods/b838a898-6391-4ed3-af8f-49ecba8d76dd/volumes"
Jan 27 15:52:52 crc kubenswrapper[4713]: I0127 15:52:52.754512 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x" podUID="70f60468-7829-429c-b61f-fae649eb5824" containerName="controller-manager" containerID="cri-o://d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71" gracePeriod=30
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.185760 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.221315 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"]
Jan 27 15:52:53 crc kubenswrapper[4713]: E0127 15:52:53.221659 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="70f60468-7829-429c-b61f-fae649eb5824" containerName="controller-manager"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.221681 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="70f60468-7829-429c-b61f-fae649eb5824" containerName="controller-manager"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.221782 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="70f60468-7829-429c-b61f-fae649eb5824" containerName="controller-manager"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.222259 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.231167 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"]
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.319590 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-proxy-ca-bundles\") pod \"70f60468-7829-429c-b61f-fae649eb5824\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") "
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.319736 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-client-ca\") pod \"70f60468-7829-429c-b61f-fae649eb5824\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") "
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.319784 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2g7qb\" (UniqueName: \"kubernetes.io/projected/70f60468-7829-429c-b61f-fae649eb5824-kube-api-access-2g7qb\") pod \"70f60468-7829-429c-b61f-fae649eb5824\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") "
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.319817 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-config\") pod \"70f60468-7829-429c-b61f-fae649eb5824\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") "
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.319862 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f60468-7829-429c-b61f-fae649eb5824-serving-cert\") pod \"70f60468-7829-429c-b61f-fae649eb5824\" (UID: \"70f60468-7829-429c-b61f-fae649eb5824\") "
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.320105 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb467cf-40ea-4016-9043-b56fdc37772f-serving-cert\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.320145 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jtlj\" (UniqueName: \"kubernetes.io/projected/5cb467cf-40ea-4016-9043-b56fdc37772f-kube-api-access-4jtlj\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.320189 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-proxy-ca-bundles\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.320232 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-client-ca\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.320273 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-config\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.320904 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-client-ca" (OuterVolumeSpecName: "client-ca") pod "70f60468-7829-429c-b61f-fae649eb5824" (UID: "70f60468-7829-429c-b61f-fae649eb5824"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.321031 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "70f60468-7829-429c-b61f-fae649eb5824" (UID: "70f60468-7829-429c-b61f-fae649eb5824"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.321489 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-config" (OuterVolumeSpecName: "config") pod "70f60468-7829-429c-b61f-fae649eb5824" (UID: "70f60468-7829-429c-b61f-fae649eb5824"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.327814 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/70f60468-7829-429c-b61f-fae649eb5824-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "70f60468-7829-429c-b61f-fae649eb5824" (UID: "70f60468-7829-429c-b61f-fae649eb5824"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.328197 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/70f60468-7829-429c-b61f-fae649eb5824-kube-api-access-2g7qb" (OuterVolumeSpecName: "kube-api-access-2g7qb") pod "70f60468-7829-429c-b61f-fae649eb5824" (UID: "70f60468-7829-429c-b61f-fae649eb5824"). InnerVolumeSpecName "kube-api-access-2g7qb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.421532 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb467cf-40ea-4016-9043-b56fdc37772f-serving-cert\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.421882 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jtlj\" (UniqueName: \"kubernetes.io/projected/5cb467cf-40ea-4016-9043-b56fdc37772f-kube-api-access-4jtlj\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.421923 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-proxy-ca-bundles\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.421950 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-client-ca\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.421983 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-config\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.422029 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.422061 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-client-ca\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.422074 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2g7qb\" (UniqueName: \"kubernetes.io/projected/70f60468-7829-429c-b61f-fae649eb5824-kube-api-access-2g7qb\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.422086 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/70f60468-7829-429c-b61f-fae649eb5824-config\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.422094 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/70f60468-7829-429c-b61f-fae649eb5824-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.423243 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-client-ca\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.423931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-config\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.424174 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-proxy-ca-bundles\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.425646 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb467cf-40ea-4016-9043-b56fdc37772f-serving-cert\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.439931 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jtlj\" (UniqueName: \"kubernetes.io/projected/5cb467cf-40ea-4016-9043-b56fdc37772f-kube-api-access-4jtlj\") pod \"controller-manager-5b8f648b64-pxsrf\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.549012 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.750654 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"]
Jan 27 15:52:53 crc kubenswrapper[4713]: W0127 15:52:53.755126 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5cb467cf_40ea_4016_9043_b56fdc37772f.slice/crio-dc1102c6133efac7a40586ee7b3f808d89b6430087e56e9c3a3b7ed270932316 WatchSource:0}: Error finding container dc1102c6133efac7a40586ee7b3f808d89b6430087e56e9c3a3b7ed270932316: Status 404 returned error can't find the container with id dc1102c6133efac7a40586ee7b3f808d89b6430087e56e9c3a3b7ed270932316
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.761307 4713 generic.go:334] "Generic (PLEG): container finished" podID="70f60468-7829-429c-b61f-fae649eb5824" containerID="d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71" exitCode=0
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.761378 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x" event={"ID":"70f60468-7829-429c-b61f-fae649eb5824","Type":"ContainerDied","Data":"d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71"}
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.761421 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x" event={"ID":"70f60468-7829-429c-b61f-fae649eb5824","Type":"ContainerDied","Data":"026eea3ef5c16b989a9dd0596b2dfd76b05ee98ae1d5f3cb80f172a6828f2dcc"}
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.761450 4713 scope.go:117] "RemoveContainer" containerID="d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.761535 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.783592 4713 scope.go:117] "RemoveContainer" containerID="d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71"
Jan 27 15:52:53 crc kubenswrapper[4713]: E0127 15:52:53.784752 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71\": container with ID starting with d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71 not found: ID does not exist" containerID="d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.784787 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71"} err="failed to get container status \"d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71\": rpc error: code = NotFound desc = could not find container \"d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71\": container with ID starting with d623e6b361e3d838b00dacb4fe8442f6102da1b91c0bef9acc3d1dc3ea5acf71 not found: ID does not exist"
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.798810 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"]
Jan 27 15:52:53 crc kubenswrapper[4713]: I0127 15:52:53.803128 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-ww29x"]
Jan 27 15:52:54 crc kubenswrapper[4713]: I0127 15:52:54.770795 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" event={"ID":"5cb467cf-40ea-4016-9043-b56fdc37772f","Type":"ContainerStarted","Data":"4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045"}
Jan 27 15:52:54 crc kubenswrapper[4713]: I0127 15:52:54.770845 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" event={"ID":"5cb467cf-40ea-4016-9043-b56fdc37772f","Type":"ContainerStarted","Data":"dc1102c6133efac7a40586ee7b3f808d89b6430087e56e9c3a3b7ed270932316"}
Jan 27 15:52:54 crc kubenswrapper[4713]: I0127 15:52:54.771485 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:54 crc kubenswrapper[4713]: I0127 15:52:54.776144 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"
Jan 27 15:52:54 crc kubenswrapper[4713]: I0127 15:52:54.792413 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" podStartSLOduration=4.792390006 podStartE2EDuration="4.792390006s" podCreationTimestamp="2026-01-27 15:52:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:52:54.791222532 +0000 UTC m=+342.569432510" watchObservedRunningTime="2026-01-27 15:52:54.792390006 +0000 UTC m=+342.570599944"
Jan 27 15:52:54 crc kubenswrapper[4713]: I0127 15:52:54.907460 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="70f60468-7829-429c-b61f-fae649eb5824" path="/var/lib/kubelet/pods/70f60468-7829-429c-b61f-fae649eb5824/volumes"
Jan 27 15:53:12 crc kubenswrapper[4713]: I0127 15:53:12.555003 4713
patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:53:12 crc kubenswrapper[4713]: I0127 15:53:12.555633 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.015729 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"] Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.016920 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" podUID="5cb467cf-40ea-4016-9043-b56fdc37772f" containerName="controller-manager" containerID="cri-o://4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045" gracePeriod=30 Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.442631 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.615325 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jtlj\" (UniqueName: \"kubernetes.io/projected/5cb467cf-40ea-4016-9043-b56fdc37772f-kube-api-access-4jtlj\") pod \"5cb467cf-40ea-4016-9043-b56fdc37772f\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.615390 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb467cf-40ea-4016-9043-b56fdc37772f-serving-cert\") pod \"5cb467cf-40ea-4016-9043-b56fdc37772f\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.615483 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-config\") pod \"5cb467cf-40ea-4016-9043-b56fdc37772f\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.615582 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-proxy-ca-bundles\") pod \"5cb467cf-40ea-4016-9043-b56fdc37772f\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.615614 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-client-ca\") pod \"5cb467cf-40ea-4016-9043-b56fdc37772f\" (UID: \"5cb467cf-40ea-4016-9043-b56fdc37772f\") " Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.616691 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-client-ca" (OuterVolumeSpecName: "client-ca") pod "5cb467cf-40ea-4016-9043-b56fdc37772f" (UID: "5cb467cf-40ea-4016-9043-b56fdc37772f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.616783 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-config" (OuterVolumeSpecName: "config") pod "5cb467cf-40ea-4016-9043-b56fdc37772f" (UID: "5cb467cf-40ea-4016-9043-b56fdc37772f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.617060 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5cb467cf-40ea-4016-9043-b56fdc37772f" (UID: "5cb467cf-40ea-4016-9043-b56fdc37772f"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.624976 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5cb467cf-40ea-4016-9043-b56fdc37772f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5cb467cf-40ea-4016-9043-b56fdc37772f" (UID: "5cb467cf-40ea-4016-9043-b56fdc37772f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.625162 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cb467cf-40ea-4016-9043-b56fdc37772f-kube-api-access-4jtlj" (OuterVolumeSpecName: "kube-api-access-4jtlj") pod "5cb467cf-40ea-4016-9043-b56fdc37772f" (UID: "5cb467cf-40ea-4016-9043-b56fdc37772f"). InnerVolumeSpecName "kube-api-access-4jtlj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.717639 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jtlj\" (UniqueName: \"kubernetes.io/projected/5cb467cf-40ea-4016-9043-b56fdc37772f-kube-api-access-4jtlj\") on node \"crc\" DevicePath \"\"" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.717990 4713 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5cb467cf-40ea-4016-9043-b56fdc37772f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.718003 4713 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.718013 4713 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:53:38 crc kubenswrapper[4713]: I0127 15:53:38.718023 4713 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5cb467cf-40ea-4016-9043-b56fdc37772f-client-ca\") on node \"crc\" DevicePath \"\"" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.050594 4713 generic.go:334] "Generic (PLEG): container finished" podID="5cb467cf-40ea-4016-9043-b56fdc37772f" containerID="4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045" exitCode=0 Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.050869 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" event={"ID":"5cb467cf-40ea-4016-9043-b56fdc37772f","Type":"ContainerDied","Data":"4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045"} Jan 27 15:53:39 crc 
kubenswrapper[4713]: I0127 15:53:39.050920 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" event={"ID":"5cb467cf-40ea-4016-9043-b56fdc37772f","Type":"ContainerDied","Data":"dc1102c6133efac7a40586ee7b3f808d89b6430087e56e9c3a3b7ed270932316"} Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.050916 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5b8f648b64-pxsrf" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.050943 4713 scope.go:117] "RemoveContainer" containerID="4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.068988 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"] Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.079156 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5b8f648b64-pxsrf"] Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.082156 4713 scope.go:117] "RemoveContainer" containerID="4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045" Jan 27 15:53:39 crc kubenswrapper[4713]: E0127 15:53:39.083339 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045\": container with ID starting with 4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045 not found: ID does not exist" containerID="4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.083404 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045"} err="failed to get container 
status \"4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045\": rpc error: code = NotFound desc = could not find container \"4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045\": container with ID starting with 4c8158383a68bc604bb9f45891ac722e40ade5cd3c701ea6efa4e9b32ccfd045 not found: ID does not exist" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.330133 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm"] Jan 27 15:53:39 crc kubenswrapper[4713]: E0127 15:53:39.330750 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cb467cf-40ea-4016-9043-b56fdc37772f" containerName="controller-manager" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.330833 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cb467cf-40ea-4016-9043-b56fdc37772f" containerName="controller-manager" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.331029 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cb467cf-40ea-4016-9043-b56fdc37772f" containerName="controller-manager" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.332403 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.335558 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.335773 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.335816 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.335902 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.336029 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.336095 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.350105 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.354923 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm"] Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.530004 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-serving-cert\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " 
pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.530298 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-config\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.530350 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-proxy-ca-bundles\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.530438 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-client-ca\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.530480 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkrrn\" (UniqueName: \"kubernetes.io/projected/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-kube-api-access-jkrrn\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.631769 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-proxy-ca-bundles\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.631878 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-client-ca\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.631924 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkrrn\" (UniqueName: \"kubernetes.io/projected/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-kube-api-access-jkrrn\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.631967 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-serving-cert\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.631990 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-config\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.633507 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-proxy-ca-bundles\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.633691 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-config\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.633710 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-client-ca\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.640088 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-serving-cert\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.658463 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkrrn\" (UniqueName: \"kubernetes.io/projected/fc1b7f65-fea0-415b-9d62-35a6fd096e1d-kube-api-access-jkrrn\") pod \"controller-manager-664c7c7bbc-xcfkm\" (UID: \"fc1b7f65-fea0-415b-9d62-35a6fd096e1d\") " pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 
15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.661017 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:39 crc kubenswrapper[4713]: I0127 15:53:39.918943 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm"] Jan 27 15:53:40 crc kubenswrapper[4713]: I0127 15:53:40.060174 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" event={"ID":"fc1b7f65-fea0-415b-9d62-35a6fd096e1d","Type":"ContainerStarted","Data":"9069db24940e9e5ec502c147b1ec7afbee1855cb8be4809b0f62a89b92491bdb"} Jan 27 15:53:40 crc kubenswrapper[4713]: I0127 15:53:40.909812 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cb467cf-40ea-4016-9043-b56fdc37772f" path="/var/lib/kubelet/pods/5cb467cf-40ea-4016-9043-b56fdc37772f/volumes" Jan 27 15:53:41 crc kubenswrapper[4713]: I0127 15:53:41.070090 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" event={"ID":"fc1b7f65-fea0-415b-9d62-35a6fd096e1d","Type":"ContainerStarted","Data":"32433faae443b1fb80ea1d055547a86892c455390c1636fc3cb21fbee1493961"} Jan 27 15:53:41 crc kubenswrapper[4713]: I0127 15:53:41.070368 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:41 crc kubenswrapper[4713]: I0127 15:53:41.075066 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" Jan 27 15:53:41 crc kubenswrapper[4713]: I0127 15:53:41.095145 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-664c7c7bbc-xcfkm" podStartSLOduration=3.095119692 
podStartE2EDuration="3.095119692s" podCreationTimestamp="2026-01-27 15:53:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:53:41.090229319 +0000 UTC m=+388.868439267" watchObservedRunningTime="2026-01-27 15:53:41.095119692 +0000 UTC m=+388.873329630" Jan 27 15:53:42 crc kubenswrapper[4713]: I0127 15:53:42.554817 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:53:42 crc kubenswrapper[4713]: I0127 15:53:42.555292 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:54:12 crc kubenswrapper[4713]: I0127 15:54:12.554652 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:54:12 crc kubenswrapper[4713]: I0127 15:54:12.555648 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:54:12 crc kubenswrapper[4713]: I0127 15:54:12.555722 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:54:12 crc kubenswrapper[4713]: I0127 15:54:12.556671 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4f76f2a78766e1dd9aa4ea8bd3b2724be1d207112925beb0d541bde19499bda"} pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:54:12 crc kubenswrapper[4713]: I0127 15:54:12.556754 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" containerID="cri-o://b4f76f2a78766e1dd9aa4ea8bd3b2724be1d207112925beb0d541bde19499bda" gracePeriod=600 Jan 27 15:54:13 crc kubenswrapper[4713]: I0127 15:54:13.269343 4713 generic.go:334] "Generic (PLEG): container finished" podID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerID="b4f76f2a78766e1dd9aa4ea8bd3b2724be1d207112925beb0d541bde19499bda" exitCode=0 Jan 27 15:54:13 crc kubenswrapper[4713]: I0127 15:54:13.269425 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerDied","Data":"b4f76f2a78766e1dd9aa4ea8bd3b2724be1d207112925beb0d541bde19499bda"} Jan 27 15:54:13 crc kubenswrapper[4713]: I0127 15:54:13.269809 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"fb203d4d7d03d6fc61ad68abfdb43cfb6ffef5fcf6fbd60deeab9e3acb4b4835"} Jan 27 15:54:13 crc kubenswrapper[4713]: I0127 15:54:13.269839 4713 scope.go:117] "RemoveContainer" 
containerID="6a0e6eadd9b34e5a8624b89638c6e8466dd82cbe4cc4681d91bd0e54df9db85c" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.368713 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xs9tk"] Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.370132 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="nbdb" containerID="cri-o://40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.370772 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="sbdb" containerID="cri-o://ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.371225 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.371295 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="northd" containerID="cri-o://674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.371354 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-node" 
containerID="cri-o://e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.371417 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-acl-logging" containerID="cri-o://36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.371751 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-controller" containerID="cri-o://89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.418180 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" containerID="cri-o://559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" gracePeriod=30 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.740930 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/3.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.743438 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovn-acl-logging/0.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.744000 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovn-controller/0.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.744566 4713 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804133 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qdzfz"] Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804415 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804433 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804445 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-node" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804454 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-node" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804465 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804474 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804486 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804494 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804511 4713 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804519 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804533 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kubecfg-setup" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804541 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kubecfg-setup" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804554 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="sbdb" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804563 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="sbdb" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804577 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-acl-logging" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804585 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-acl-logging" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804598 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="nbdb" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804606 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="nbdb" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804617 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="northd" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804624 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="northd" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804640 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804648 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.804657 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804665 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804778 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="sbdb" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804793 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804804 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804816 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-node" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804827 4713 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-acl-logging" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804839 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="kube-rbac-proxy-ovn-metrics" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804851 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovn-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804866 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="nbdb" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804875 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.804887 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="northd" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.805002 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.805012 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.805153 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.805389 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerName="ovnkube-controller" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.807258 4713 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.916939 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-log-socket\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.916992 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-netns\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917020 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-slash\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917088 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-env-overrides\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917088 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-log-socket" (OuterVolumeSpecName: "log-socket") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917109 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-ovn\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917136 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-systemd-units\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917138 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-slash" (OuterVolumeSpecName: "host-slash") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917158 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-var-lib-openvswitch\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917161 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917180 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917212 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-script-lib\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917252 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/131f5d56-4900-4558-abfa-24c9e999e5ad-ovn-node-metrics-cert\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917292 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-ovn-kubernetes\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917327 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-config\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917322 4713 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917369 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-kubelet\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917377 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917466 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917457 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). 
InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917498 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-netd\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917520 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-bin\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917534 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917557 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917565 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-systemd\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917586 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-var-lib-cni-networks-ovn-kubernetes\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917574 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917607 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-openvswitch\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917643 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-etc-openvswitch\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917653 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5w9kd\" (UniqueName: \"kubernetes.io/projected/131f5d56-4900-4558-abfa-24c9e999e5ad-kube-api-access-5w9kd\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917686 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917710 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917733 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-node-log" (OuterVolumeSpecName: "node-log") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "node-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917710 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-node-log\") pod \"131f5d56-4900-4558-abfa-24c9e999e5ad\" (UID: \"131f5d56-4900-4558-abfa-24c9e999e5ad\") " Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917867 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917912 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-env-overrides\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-var-lib-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.917979 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-etc-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: 
\"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.918103 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.918953 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-run-netns\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.918994 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919015 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-ovn\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919051 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-kubelet\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919128 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-cni-netd\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919240 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-ovnkube-script-lib\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919289 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-ovnkube-config\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919318 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-log-socket\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919461 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-slash\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919526 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5e37a90-410a-48d2-8a28-87fba116541e-ovn-node-metrics-cert\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919567 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-node-log\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919611 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pjjw7\" (UniqueName: \"kubernetes.io/projected/b5e37a90-410a-48d2-8a28-87fba116541e-kube-api-access-pjjw7\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919702 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919734 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-systemd\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919756 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-run-ovn-kubernetes\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919816 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-systemd-units\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919851 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-cni-bin\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919931 4713 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919944 4713 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-var-lib-openvswitch\") on node \"crc\" DevicePath 
\"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919954 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919964 4713 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919976 4713 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919985 4713 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.919997 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920008 4713 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920017 4713 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 
15:55:52.920028 4713 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920058 4713 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920074 4713 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-node-log\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920086 4713 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-log-socket\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920097 4713 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920108 4713 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-host-slash\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920119 4713 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/131f5d56-4900-4558-abfa-24c9e999e5ad-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.920128 4713 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.924693 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/131f5d56-4900-4558-abfa-24c9e999e5ad-kube-api-access-5w9kd" (OuterVolumeSpecName: "kube-api-access-5w9kd") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "kube-api-access-5w9kd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.925006 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/131f5d56-4900-4558-abfa-24c9e999e5ad-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.927379 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovnkube-controller/3.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.929840 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovn-acl-logging/0.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.930340 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xs9tk_131f5d56-4900-4558-abfa-24c9e999e5ad/ovn-controller/0.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.930701 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" exitCode=0 Jan 27 15:55:52 crc 
kubenswrapper[4713]: I0127 15:55:52.930801 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.930812 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" exitCode=0 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.930926 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" exitCode=0 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.930987 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" exitCode=0 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931090 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" exitCode=0 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931151 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" exitCode=0 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931200 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" exitCode=143 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931253 4713 generic.go:334] "Generic (PLEG): container finished" podID="131f5d56-4900-4558-abfa-24c9e999e5ad" containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" exitCode=143 Jan 27 15:55:52 crc 
kubenswrapper[4713]: I0127 15:55:52.930777 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931367 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931399 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931413 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931422 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931433 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931445 4713 pod_container_deletor.go:114] "Failed to issue 
the request to remove container" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931461 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931467 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931474 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931480 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931485 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931490 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931497 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931503 4713 pod_container_deletor.go:114] "Failed to issue the 
request to remove container" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931497 4713 scope.go:117] "RemoveContainer" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931511 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931634 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931647 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931654 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931661 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931667 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931674 4713 pod_container_deletor.go:114] "Failed to issue the request 
to remove container" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931682 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931690 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931696 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931703 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931714 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931726 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931733 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931739 
4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931744 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931752 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931758 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931763 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931768 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931773 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931778 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931785 4713 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xs9tk" event={"ID":"131f5d56-4900-4558-abfa-24c9e999e5ad","Type":"ContainerDied","Data":"e03bc59b04541a6766bd487d7d12b47cf4d7da2b78b65f45ff4c8b9f05c011d2"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931794 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931800 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931806 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931811 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931816 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931823 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931828 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} Jan 27 
15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931833 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931839 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.931844 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.933874 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "131f5d56-4900-4558-abfa-24c9e999e5ad" (UID: "131f5d56-4900-4558-abfa-24c9e999e5ad"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.935373 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/2.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.936101 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/1.log" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.936155 4713 generic.go:334] "Generic (PLEG): container finished" podID="bf07c585-f90e-4416-a66c-d41547008320" containerID="8d44969cb30e758dfd5d789635414864d3e62f8dee75ba64ba6c62a4e8a4600a" exitCode=2 Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.936193 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerDied","Data":"8d44969cb30e758dfd5d789635414864d3e62f8dee75ba64ba6c62a4e8a4600a"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.936223 4713 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd"} Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.936674 4713 scope.go:117] "RemoveContainer" containerID="8d44969cb30e758dfd5d789635414864d3e62f8dee75ba64ba6c62a4e8a4600a" Jan 27 15:55:52 crc kubenswrapper[4713]: E0127 15:55:52.936881 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n7wxq_openshift-multus(bf07c585-f90e-4416-a66c-d41547008320)\"" pod="openshift-multus/multus-n7wxq" podUID="bf07c585-f90e-4416-a66c-d41547008320" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.952184 4713 scope.go:117] "RemoveContainer" 
containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.972216 4713 scope.go:117] "RemoveContainer" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.985871 4713 scope.go:117] "RemoveContainer" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" Jan 27 15:55:52 crc kubenswrapper[4713]: I0127 15:55:52.999230 4713 scope.go:117] "RemoveContainer" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.012524 4713 scope.go:117] "RemoveContainer" containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021618 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-ovnkube-script-lib\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021688 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-ovnkube-config\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021740 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-log-socket\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 
15:55:53.021768 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-slash\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021811 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-node-log\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021833 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5e37a90-410a-48d2-8a28-87fba116541e-ovn-node-metrics-cert\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021866 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pjjw7\" (UniqueName: \"kubernetes.io/projected/b5e37a90-410a-48d2-8a28-87fba116541e-kube-api-access-pjjw7\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021924 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021944 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-systemd\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021967 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-log-socket\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021985 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-run-ovn-kubernetes\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022020 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-run-ovn-kubernetes\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.021998 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-slash\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022083 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-systemd-units\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022093 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022017 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-node-log\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022056 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-systemd-units\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022160 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-systemd\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022194 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-cni-bin\") pod \"ovnkube-node-qdzfz\" (UID: 
\"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022252 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-env-overrides\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-cni-bin\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022338 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-var-lib-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022391 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-etc-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022416 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-run-netns\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022478 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-var-lib-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022516 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-etc-openvswitch\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022522 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-ovnkube-script-lib\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022556 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022597 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-ovn\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022612 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022621 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-kubelet\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022646 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-cni-netd\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022642 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-run-ovn\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022647 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-run-netns\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc 
kubenswrapper[4713]: I0127 15:55:53.022673 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-ovnkube-config\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022685 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-kubelet\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022725 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b5e37a90-410a-48d2-8a28-87fba116541e-host-cni-netd\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022805 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5w9kd\" (UniqueName: \"kubernetes.io/projected/131f5d56-4900-4558-abfa-24c9e999e5ad-kube-api-access-5w9kd\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022841 4713 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/131f5d56-4900-4558-abfa-24c9e999e5ad-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022857 4713 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/131f5d56-4900-4558-abfa-24c9e999e5ad-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.022858 4713 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b5e37a90-410a-48d2-8a28-87fba116541e-env-overrides\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.027299 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b5e37a90-410a-48d2-8a28-87fba116541e-ovn-node-metrics-cert\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.029254 4713 scope.go:117] "RemoveContainer" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.041856 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pjjw7\" (UniqueName: \"kubernetes.io/projected/b5e37a90-410a-48d2-8a28-87fba116541e-kube-api-access-pjjw7\") pod \"ovnkube-node-qdzfz\" (UID: \"b5e37a90-410a-48d2-8a28-87fba116541e\") " pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.044391 4713 scope.go:117] "RemoveContainer" containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.059011 4713 scope.go:117] "RemoveContainer" containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.073375 4713 scope.go:117] "RemoveContainer" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.088211 4713 scope.go:117] "RemoveContainer" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:53 crc 
kubenswrapper[4713]: E0127 15:55:53.088634 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": container with ID starting with 559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953 not found: ID does not exist" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.088670 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} err="failed to get container status \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": rpc error: code = NotFound desc = could not find container \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": container with ID starting with 559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.088698 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.089061 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": container with ID starting with e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10 not found: ID does not exist" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.089095 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} err="failed to get container status 
\"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": rpc error: code = NotFound desc = could not find container \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": container with ID starting with e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.089113 4713 scope.go:117] "RemoveContainer" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.089468 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": container with ID starting with ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c not found: ID does not exist" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.089497 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} err="failed to get container status \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": rpc error: code = NotFound desc = could not find container \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": container with ID starting with ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.089513 4713 scope.go:117] "RemoveContainer" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.089756 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": container with ID starting with 40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b not found: ID does not exist" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.089775 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} err="failed to get container status \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": rpc error: code = NotFound desc = could not find container \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": container with ID starting with 40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.089788 4713 scope.go:117] "RemoveContainer" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.090189 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": container with ID starting with 674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710 not found: ID does not exist" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.090208 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} err="failed to get container status \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": rpc error: code = NotFound desc = could not find container \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": container with ID 
starting with 674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.090220 4713 scope.go:117] "RemoveContainer" containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.090666 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": container with ID starting with 2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75 not found: ID does not exist" containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.090685 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} err="failed to get container status \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": rpc error: code = NotFound desc = could not find container \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": container with ID starting with 2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.090700 4713 scope.go:117] "RemoveContainer" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.090941 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": container with ID starting with e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6 not found: ID does not exist" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" Jan 27 
15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.090969 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} err="failed to get container status \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": rpc error: code = NotFound desc = could not find container \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": container with ID starting with e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.090988 4713 scope.go:117] "RemoveContainer" containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.091369 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": container with ID starting with 36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd not found: ID does not exist" containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.091390 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} err="failed to get container status \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": rpc error: code = NotFound desc = could not find container \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": container with ID starting with 36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.091429 4713 scope.go:117] "RemoveContainer" 
containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.091703 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": container with ID starting with 89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128 not found: ID does not exist" containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.091722 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} err="failed to get container status \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": rpc error: code = NotFound desc = could not find container \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": container with ID starting with 89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.091756 4713 scope.go:117] "RemoveContainer" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" Jan 27 15:55:53 crc kubenswrapper[4713]: E0127 15:55:53.092080 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": container with ID starting with e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da not found: ID does not exist" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092100 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} err="failed to get container status \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": rpc error: code = NotFound desc = could not find container \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": container with ID starting with e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092155 4713 scope.go:117] "RemoveContainer" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092485 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} err="failed to get container status \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": rpc error: code = NotFound desc = could not find container \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": container with ID starting with 559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092502 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092727 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} err="failed to get container status \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": rpc error: code = NotFound desc = could not find container \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": container with ID starting with e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10 not found: ID does not 
exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092743 4713 scope.go:117] "RemoveContainer" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092955 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} err="failed to get container status \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": rpc error: code = NotFound desc = could not find container \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": container with ID starting with ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.092973 4713 scope.go:117] "RemoveContainer" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093285 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} err="failed to get container status \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": rpc error: code = NotFound desc = could not find container \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": container with ID starting with 40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093301 4713 scope.go:117] "RemoveContainer" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093500 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} err="failed to get container status 
\"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": rpc error: code = NotFound desc = could not find container \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": container with ID starting with 674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093515 4713 scope.go:117] "RemoveContainer" containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093721 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} err="failed to get container status \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": rpc error: code = NotFound desc = could not find container \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": container with ID starting with 2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093739 4713 scope.go:117] "RemoveContainer" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093915 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} err="failed to get container status \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": rpc error: code = NotFound desc = could not find container \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": container with ID starting with e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.093930 4713 scope.go:117] "RemoveContainer" 
containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.094340 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} err="failed to get container status \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": rpc error: code = NotFound desc = could not find container \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": container with ID starting with 36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.094356 4713 scope.go:117] "RemoveContainer" containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.094688 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} err="failed to get container status \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": rpc error: code = NotFound desc = could not find container \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": container with ID starting with 89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.094704 4713 scope.go:117] "RemoveContainer" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.094960 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} err="failed to get container status \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": rpc error: code = NotFound desc = could 
not find container \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": container with ID starting with e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.094980 4713 scope.go:117] "RemoveContainer" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.095346 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} err="failed to get container status \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": rpc error: code = NotFound desc = could not find container \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": container with ID starting with 559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.095365 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.095679 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} err="failed to get container status \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": rpc error: code = NotFound desc = could not find container \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": container with ID starting with e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.095696 4713 scope.go:117] "RemoveContainer" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 
15:55:53.095895 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} err="failed to get container status \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": rpc error: code = NotFound desc = could not find container \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": container with ID starting with ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.095912 4713 scope.go:117] "RemoveContainer" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096135 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} err="failed to get container status \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": rpc error: code = NotFound desc = could not find container \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": container with ID starting with 40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096152 4713 scope.go:117] "RemoveContainer" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096345 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} err="failed to get container status \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": rpc error: code = NotFound desc = could not find container \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": container with ID starting with 
674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096361 4713 scope.go:117] "RemoveContainer" containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096613 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} err="failed to get container status \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": rpc error: code = NotFound desc = could not find container \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": container with ID starting with 2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096627 4713 scope.go:117] "RemoveContainer" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096881 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} err="failed to get container status \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": rpc error: code = NotFound desc = could not find container \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": container with ID starting with e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.096895 4713 scope.go:117] "RemoveContainer" containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097072 4713 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} err="failed to get container status \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": rpc error: code = NotFound desc = could not find container \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": container with ID starting with 36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097099 4713 scope.go:117] "RemoveContainer" containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097368 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} err="failed to get container status \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": rpc error: code = NotFound desc = could not find container \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": container with ID starting with 89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097385 4713 scope.go:117] "RemoveContainer" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097589 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} err="failed to get container status \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": rpc error: code = NotFound desc = could not find container \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": container with ID starting with e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da not found: ID does not 
exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097616 4713 scope.go:117] "RemoveContainer" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097770 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} err="failed to get container status \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": rpc error: code = NotFound desc = could not find container \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": container with ID starting with 559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097785 4713 scope.go:117] "RemoveContainer" containerID="e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097969 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10"} err="failed to get container status \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": rpc error: code = NotFound desc = could not find container \"e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10\": container with ID starting with e4200af4785913622c8548a2c909a2a45849192f84ee6596ba7067584478af10 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.097985 4713 scope.go:117] "RemoveContainer" containerID="ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.098283 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c"} err="failed to get container status 
\"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": rpc error: code = NotFound desc = could not find container \"ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c\": container with ID starting with ee181cae3c92798870e0b751cdeef4a83d93a74e228c2e117f737e307800ab8c not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.098301 4713 scope.go:117] "RemoveContainer" containerID="40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.098541 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b"} err="failed to get container status \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": rpc error: code = NotFound desc = could not find container \"40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b\": container with ID starting with 40471637bb727190d245e78aa62b78aee446224a012f6ccca775f28aeb9d1f0b not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.098559 4713 scope.go:117] "RemoveContainer" containerID="674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.098832 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710"} err="failed to get container status \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": rpc error: code = NotFound desc = could not find container \"674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710\": container with ID starting with 674adcfbc9a6148042546f105f07dc97f215b037aa8ceb920d1facb6b0404710 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.098849 4713 scope.go:117] "RemoveContainer" 
containerID="2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.099007 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75"} err="failed to get container status \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": rpc error: code = NotFound desc = could not find container \"2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75\": container with ID starting with 2a51c221f0226903230d8504264e6b47bd53d240c911acc12523947b25911b75 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.099023 4713 scope.go:117] "RemoveContainer" containerID="e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.099214 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6"} err="failed to get container status \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": rpc error: code = NotFound desc = could not find container \"e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6\": container with ID starting with e9f267fb6dc00703a36499bca41bd8112793130c4a80fc09bb57a28720113aa6 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.099231 4713 scope.go:117] "RemoveContainer" containerID="36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.101361 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd"} err="failed to get container status \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": rpc error: code = NotFound desc = could 
not find container \"36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd\": container with ID starting with 36da78e593bfc6d5383e0941ebe152123d7a3e5ef39f542adb6654f0ca5e7ecd not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.101422 4713 scope.go:117] "RemoveContainer" containerID="89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.101761 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128"} err="failed to get container status \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": rpc error: code = NotFound desc = could not find container \"89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128\": container with ID starting with 89eeb9c953a18c23ef7118650eea681bc1eaa38e8a4c701910b0baf563187128 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.101784 4713 scope.go:117] "RemoveContainer" containerID="e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.102021 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da"} err="failed to get container status \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": rpc error: code = NotFound desc = could not find container \"e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da\": container with ID starting with e4acbc7be6d67a24b44e6bc095a08ab1512aa79d5d49060076c9dd45fc65b7da not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.102056 4713 scope.go:117] "RemoveContainer" containerID="559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 
15:55:53.102282 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953"} err="failed to get container status \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": rpc error: code = NotFound desc = could not find container \"559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953\": container with ID starting with 559320d5eca18f4e55794b395bc63356c05f7a54091533f8498545200034a953 not found: ID does not exist" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.127639 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.264963 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xs9tk"] Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.271525 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xs9tk"] Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.948387 4713 generic.go:334] "Generic (PLEG): container finished" podID="b5e37a90-410a-48d2-8a28-87fba116541e" containerID="76e99d80183de4bb9c6812821b69583dfa5ed1d9a05ee13340254020d6950328" exitCode=0 Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.948454 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerDied","Data":"76e99d80183de4bb9c6812821b69583dfa5ed1d9a05ee13340254020d6950328"} Jan 27 15:55:53 crc kubenswrapper[4713]: I0127 15:55:53.948553 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"5bedbb6b87b5a3073c2eaad6071968f55552e5e178333fd78c0f5ea892704a0b"} Jan 27 15:55:54 crc 
kubenswrapper[4713]: I0127 15:55:54.907175 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="131f5d56-4900-4558-abfa-24c9e999e5ad" path="/var/lib/kubelet/pods/131f5d56-4900-4558-abfa-24c9e999e5ad/volumes" Jan 27 15:55:54 crc kubenswrapper[4713]: I0127 15:55:54.958242 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"70452110676ca3109a2c12adc2759cdc8c3c593eb94d6c8c36c6f287332b551a"} Jan 27 15:55:54 crc kubenswrapper[4713]: I0127 15:55:54.958322 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"f02fa3223450cc56d31032b7899da1d121a01081e9322300935f99eddf746ff0"} Jan 27 15:55:54 crc kubenswrapper[4713]: I0127 15:55:54.958336 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"4471d7234e1bb8887934d1a00f28f0f53102e39c0fb7e2cb87e5dcc7448be4d5"} Jan 27 15:55:54 crc kubenswrapper[4713]: I0127 15:55:54.958348 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"31f787b3c3f4fd21fa623b8e6eddfdb2651a333e044823e8056854f78edb76c7"} Jan 27 15:55:54 crc kubenswrapper[4713]: I0127 15:55:54.958359 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"fb72288833da00b7cd91d86048460e74399a554734be4206482899ff25f3241b"} Jan 27 15:55:54 crc kubenswrapper[4713]: I0127 15:55:54.958370 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" 
event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"96765e24ddf29df31453557cd4eaaf2831c8ddeca8e39ddcc21faa08c7a89cdf"} Jan 27 15:55:56 crc kubenswrapper[4713]: I0127 15:55:56.974294 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"092ac13cd8478698e570121de4284af5c73c47d1e5afa7cc6290880b49ec13b4"} Jan 27 15:55:59 crc kubenswrapper[4713]: I0127 15:55:59.996062 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" event={"ID":"b5e37a90-410a-48d2-8a28-87fba116541e","Type":"ContainerStarted","Data":"32435631ca7ebffee6b7bfc646bf24891dc981b4cbb7bd6d07362ee94eaffed0"} Jan 27 15:55:59 crc kubenswrapper[4713]: I0127 15:55:59.996896 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:59 crc kubenswrapper[4713]: I0127 15:55:59.996919 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:55:59 crc kubenswrapper[4713]: I0127 15:55:59.996931 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:56:00 crc kubenswrapper[4713]: I0127 15:56:00.029877 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:56:00 crc kubenswrapper[4713]: I0127 15:56:00.030900 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:56:00 crc kubenswrapper[4713]: I0127 15:56:00.043350 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" podStartSLOduration=8.043322298 podStartE2EDuration="8.043322298s" podCreationTimestamp="2026-01-27 
15:55:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:56:00.038396146 +0000 UTC m=+527.816606094" watchObservedRunningTime="2026-01-27 15:56:00.043322298 +0000 UTC m=+527.821532236" Jan 27 15:56:05 crc kubenswrapper[4713]: I0127 15:56:05.900000 4713 scope.go:117] "RemoveContainer" containerID="8d44969cb30e758dfd5d789635414864d3e62f8dee75ba64ba6c62a4e8a4600a" Jan 27 15:56:05 crc kubenswrapper[4713]: E0127 15:56:05.901353 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-n7wxq_openshift-multus(bf07c585-f90e-4416-a66c-d41547008320)\"" pod="openshift-multus/multus-n7wxq" podUID="bf07c585-f90e-4416-a66c-d41547008320" Jan 27 15:56:12 crc kubenswrapper[4713]: I0127 15:56:12.555006 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:56:12 crc kubenswrapper[4713]: I0127 15:56:12.555477 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:56:13 crc kubenswrapper[4713]: I0127 15:56:13.208951 4713 scope.go:117] "RemoveContainer" containerID="30964991c715fef1f0e7ee97a2a0bbbab1a671ddbd347113c96a318b76d91bfd" Jan 27 15:56:14 crc kubenswrapper[4713]: I0127 15:56:14.090633 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/2.log" Jan 27 15:56:20 crc kubenswrapper[4713]: I0127 15:56:20.899313 4713 scope.go:117] "RemoveContainer" containerID="8d44969cb30e758dfd5d789635414864d3e62f8dee75ba64ba6c62a4e8a4600a" Jan 27 15:56:21 crc kubenswrapper[4713]: I0127 15:56:21.140167 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-n7wxq_bf07c585-f90e-4416-a66c-d41547008320/kube-multus/2.log" Jan 27 15:56:21 crc kubenswrapper[4713]: I0127 15:56:21.140679 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-n7wxq" event={"ID":"bf07c585-f90e-4416-a66c-d41547008320","Type":"ContainerStarted","Data":"572ec24f7df131ee10ab02b1d6fbe16f91d0a5350f76206dc4d74c182b1da0f7"} Jan 27 15:56:23 crc kubenswrapper[4713]: I0127 15:56:23.162837 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qdzfz" Jan 27 15:56:42 crc kubenswrapper[4713]: I0127 15:56:42.555369 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:56:42 crc kubenswrapper[4713]: I0127 15:56:42.556382 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.554863 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.557453 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.557772 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.558660 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"fb203d4d7d03d6fc61ad68abfdb43cfb6ffef5fcf6fbd60deeab9e3acb4b4835"} pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.558853 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" containerID="cri-o://fb203d4d7d03d6fc61ad68abfdb43cfb6ffef5fcf6fbd60deeab9e3acb4b4835" gracePeriod=600 Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.985832 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtsb"] Jan 27 15:57:12 crc kubenswrapper[4713]: I0127 15:57:12.986944 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rwtsb" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="registry-server" 
containerID="cri-o://34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3" gracePeriod=30 Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.031111 4713 generic.go:334] "Generic (PLEG): container finished" podID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerID="fb203d4d7d03d6fc61ad68abfdb43cfb6ffef5fcf6fbd60deeab9e3acb4b4835" exitCode=0 Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.031169 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerDied","Data":"fb203d4d7d03d6fc61ad68abfdb43cfb6ffef5fcf6fbd60deeab9e3acb4b4835"} Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.031205 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"ce9cf5c90b2ef4b5d8bbf232cb22333413a90aeb960b0273756421fdaf75fb8b"} Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.031227 4713 scope.go:117] "RemoveContainer" containerID="b4f76f2a78766e1dd9aa4ea8bd3b2724be1d207112925beb0d541bde19499bda" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.353724 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.448507 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-utilities\") pod \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.448633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-catalog-content\") pod \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.448749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6nrd\" (UniqueName: \"kubernetes.io/projected/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-kube-api-access-t6nrd\") pod \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\" (UID: \"c255f079-b4c8-4c29-8e77-28e56c9a9ecf\") " Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.461347 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-kube-api-access-t6nrd" (OuterVolumeSpecName: "kube-api-access-t6nrd") pod "c255f079-b4c8-4c29-8e77-28e56c9a9ecf" (UID: "c255f079-b4c8-4c29-8e77-28e56c9a9ecf"). InnerVolumeSpecName "kube-api-access-t6nrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.462448 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-utilities" (OuterVolumeSpecName: "utilities") pod "c255f079-b4c8-4c29-8e77-28e56c9a9ecf" (UID: "c255f079-b4c8-4c29-8e77-28e56c9a9ecf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.479348 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c255f079-b4c8-4c29-8e77-28e56c9a9ecf" (UID: "c255f079-b4c8-4c29-8e77-28e56c9a9ecf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.550867 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.550929 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:13 crc kubenswrapper[4713]: I0127 15:57:13.550976 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t6nrd\" (UniqueName: \"kubernetes.io/projected/c255f079-b4c8-4c29-8e77-28e56c9a9ecf-kube-api-access-t6nrd\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.039389 4713 generic.go:334] "Generic (PLEG): container finished" podID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerID="34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3" exitCode=0 Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.039483 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rwtsb" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.039491 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerDied","Data":"34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3"} Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.039550 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rwtsb" event={"ID":"c255f079-b4c8-4c29-8e77-28e56c9a9ecf","Type":"ContainerDied","Data":"0357391f3717f9272955b2eae88630ae9751d3f8ff45378ea451fd31e4386759"} Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.039580 4713 scope.go:117] "RemoveContainer" containerID="34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.067681 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtsb"] Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.072777 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rwtsb"] Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.074847 4713 scope.go:117] "RemoveContainer" containerID="ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.107414 4713 scope.go:117] "RemoveContainer" containerID="6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.127976 4713 scope.go:117] "RemoveContainer" containerID="34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3" Jan 27 15:57:14 crc kubenswrapper[4713]: E0127 15:57:14.128788 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3\": container with ID starting with 34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3 not found: ID does not exist" containerID="34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.128828 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3"} err="failed to get container status \"34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3\": rpc error: code = NotFound desc = could not find container \"34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3\": container with ID starting with 34bb448fbf100671c3ee9c0a027180de976c431c3a83cc1a58ef5820a21210c3 not found: ID does not exist" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.128855 4713 scope.go:117] "RemoveContainer" containerID="ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296" Jan 27 15:57:14 crc kubenswrapper[4713]: E0127 15:57:14.129404 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296\": container with ID starting with ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296 not found: ID does not exist" containerID="ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.129472 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296"} err="failed to get container status \"ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296\": rpc error: code = NotFound desc = could not find container \"ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296\": container with ID 
starting with ad5614e6b504f3300a4d45c971b7fb35d8f3ab75ef896d72d2e46d3b56ba6296 not found: ID does not exist" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.129513 4713 scope.go:117] "RemoveContainer" containerID="6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1" Jan 27 15:57:14 crc kubenswrapper[4713]: E0127 15:57:14.130084 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1\": container with ID starting with 6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1 not found: ID does not exist" containerID="6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.130113 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1"} err="failed to get container status \"6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1\": rpc error: code = NotFound desc = could not find container \"6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1\": container with ID starting with 6f04baf981f2a08d570dc6de725b7ccecfbfa9e098122ad9ce4342ea4925ede1 not found: ID does not exist" Jan 27 15:57:14 crc kubenswrapper[4713]: I0127 15:57:14.910355 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" path="/var/lib/kubelet/pods/c255f079-b4c8-4c29-8e77-28e56c9a9ecf/volumes" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.836688 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc"] Jan 27 15:57:16 crc kubenswrapper[4713]: E0127 15:57:16.836922 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" 
containerName="extract-utilities" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.836938 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="extract-utilities" Jan 27 15:57:16 crc kubenswrapper[4713]: E0127 15:57:16.836951 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="registry-server" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.836958 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="registry-server" Jan 27 15:57:16 crc kubenswrapper[4713]: E0127 15:57:16.836967 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="extract-content" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.836973 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="extract-content" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.837082 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="c255f079-b4c8-4c29-8e77-28e56c9a9ecf" containerName="registry-server" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.837907 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.839995 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.848287 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc"] Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.893378 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mg4z\" (UniqueName: \"kubernetes.io/projected/1492d427-0396-4def-8421-920adf90b954-kube-api-access-5mg4z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.893817 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.893857 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: 
I0127 15:57:16.995539 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mg4z\" (UniqueName: \"kubernetes.io/projected/1492d427-0396-4def-8421-920adf90b954-kube-api-access-5mg4z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.995611 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.995642 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.996345 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:16 crc kubenswrapper[4713]: I0127 15:57:16.996675 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:17 crc kubenswrapper[4713]: I0127 15:57:17.023391 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mg4z\" (UniqueName: \"kubernetes.io/projected/1492d427-0396-4def-8421-920adf90b954-kube-api-access-5mg4z\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:17 crc kubenswrapper[4713]: I0127 15:57:17.156255 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:17 crc kubenswrapper[4713]: I0127 15:57:17.356815 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc"] Jan 27 15:57:17 crc kubenswrapper[4713]: W0127 15:57:17.363155 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1492d427_0396_4def_8421_920adf90b954.slice/crio-235f883981b599b3adebeae44deb3e57e285d7dec97354cba8afdce4c6a4000f WatchSource:0}: Error finding container 235f883981b599b3adebeae44deb3e57e285d7dec97354cba8afdce4c6a4000f: Status 404 returned error can't find the container with id 235f883981b599b3adebeae44deb3e57e285d7dec97354cba8afdce4c6a4000f Jan 27 15:57:18 crc kubenswrapper[4713]: I0127 15:57:18.069105 4713 generic.go:334] "Generic (PLEG): container finished" podID="1492d427-0396-4def-8421-920adf90b954" containerID="fdbe56b49c9c49076ff46fb7ce017372d2b3353c9829e64de0e45ba72c879cbb" exitCode=0 
Jan 27 15:57:18 crc kubenswrapper[4713]: I0127 15:57:18.069163 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" event={"ID":"1492d427-0396-4def-8421-920adf90b954","Type":"ContainerDied","Data":"fdbe56b49c9c49076ff46fb7ce017372d2b3353c9829e64de0e45ba72c879cbb"} Jan 27 15:57:18 crc kubenswrapper[4713]: I0127 15:57:18.069197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" event={"ID":"1492d427-0396-4def-8421-920adf90b954","Type":"ContainerStarted","Data":"235f883981b599b3adebeae44deb3e57e285d7dec97354cba8afdce4c6a4000f"} Jan 27 15:57:18 crc kubenswrapper[4713]: I0127 15:57:18.070783 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 27 15:57:20 crc kubenswrapper[4713]: I0127 15:57:20.089863 4713 generic.go:334] "Generic (PLEG): container finished" podID="1492d427-0396-4def-8421-920adf90b954" containerID="56e0a94d6fbf1d9d27930ea98fe61dbd7cb3531842b1e933fc6485d1df47a5c9" exitCode=0 Jan 27 15:57:20 crc kubenswrapper[4713]: I0127 15:57:20.089950 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" event={"ID":"1492d427-0396-4def-8421-920adf90b954","Type":"ContainerDied","Data":"56e0a94d6fbf1d9d27930ea98fe61dbd7cb3531842b1e933fc6485d1df47a5c9"} Jan 27 15:57:21 crc kubenswrapper[4713]: I0127 15:57:21.098523 4713 generic.go:334] "Generic (PLEG): container finished" podID="1492d427-0396-4def-8421-920adf90b954" containerID="82392ade114458d460f65ed823ac205028a5cfd8f8198d79f49defa3d375b339" exitCode=0 Jan 27 15:57:21 crc kubenswrapper[4713]: I0127 15:57:21.098589 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" 
event={"ID":"1492d427-0396-4def-8421-920adf90b954","Type":"ContainerDied","Data":"82392ade114458d460f65ed823ac205028a5cfd8f8198d79f49defa3d375b339"} Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.332093 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.471546 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-bundle\") pod \"1492d427-0396-4def-8421-920adf90b954\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.471608 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-util\") pod \"1492d427-0396-4def-8421-920adf90b954\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.471633 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5mg4z\" (UniqueName: \"kubernetes.io/projected/1492d427-0396-4def-8421-920adf90b954-kube-api-access-5mg4z\") pod \"1492d427-0396-4def-8421-920adf90b954\" (UID: \"1492d427-0396-4def-8421-920adf90b954\") " Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.476743 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-bundle" (OuterVolumeSpecName: "bundle") pod "1492d427-0396-4def-8421-920adf90b954" (UID: "1492d427-0396-4def-8421-920adf90b954"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.481948 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1492d427-0396-4def-8421-920adf90b954-kube-api-access-5mg4z" (OuterVolumeSpecName: "kube-api-access-5mg4z") pod "1492d427-0396-4def-8421-920adf90b954" (UID: "1492d427-0396-4def-8421-920adf90b954"). InnerVolumeSpecName "kube-api-access-5mg4z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.493287 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-util" (OuterVolumeSpecName: "util") pod "1492d427-0396-4def-8421-920adf90b954" (UID: "1492d427-0396-4def-8421-920adf90b954"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.573102 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.573585 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1492d427-0396-4def-8421-920adf90b954-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:22 crc kubenswrapper[4713]: I0127 15:57:22.573606 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5mg4z\" (UniqueName: \"kubernetes.io/projected/1492d427-0396-4def-8421-920adf90b954-kube-api-access-5mg4z\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.112272 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" 
event={"ID":"1492d427-0396-4def-8421-920adf90b954","Type":"ContainerDied","Data":"235f883981b599b3adebeae44deb3e57e285d7dec97354cba8afdce4c6a4000f"} Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.112334 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="235f883981b599b3adebeae44deb3e57e285d7dec97354cba8afdce4c6a4000f" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.112394 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.427069 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq"] Jan 27 15:57:23 crc kubenswrapper[4713]: E0127 15:57:23.427438 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="util" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.427479 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="util" Jan 27 15:57:23 crc kubenswrapper[4713]: E0127 15:57:23.427494 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="pull" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.427500 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="pull" Jan 27 15:57:23 crc kubenswrapper[4713]: E0127 15:57:23.427508 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="extract" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.427514 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="extract" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 
15:57:23.427672 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="1492d427-0396-4def-8421-920adf90b954" containerName="extract" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.428873 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.434710 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.443126 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq"] Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.485771 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.485889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.485923 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9zdz\" (UniqueName: 
\"kubernetes.io/projected/8774e190-04c5-404d-b6e9-a86e93f0d0af-kube-api-access-z9zdz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.587898 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.588206 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.588286 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z9zdz\" (UniqueName: \"kubernetes.io/projected/8774e190-04c5-404d-b6e9-a86e93f0d0af-kube-api-access-z9zdz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.588484 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-bundle\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" 
(UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.588798 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-util\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.606581 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9zdz\" (UniqueName: \"kubernetes.io/projected/8774e190-04c5-404d-b6e9-a86e93f0d0af-kube-api-access-z9zdz\") pod \"6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.746391 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:23 crc kubenswrapper[4713]: I0127 15:57:23.950915 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq"] Jan 27 15:57:23 crc kubenswrapper[4713]: W0127 15:57:23.968845 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8774e190_04c5_404d_b6e9_a86e93f0d0af.slice/crio-32e4b8c7dd53cc6edc725be36985529a8f3d076652b18de2debe3632bd22f8ac WatchSource:0}: Error finding container 32e4b8c7dd53cc6edc725be36985529a8f3d076652b18de2debe3632bd22f8ac: Status 404 returned error can't find the container with id 32e4b8c7dd53cc6edc725be36985529a8f3d076652b18de2debe3632bd22f8ac Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.119171 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" event={"ID":"8774e190-04c5-404d-b6e9-a86e93f0d0af","Type":"ContainerStarted","Data":"ffe3ea5609b8ab42d2b31f3b9345f68ae1aca5fa14ed59cd1c6a487fa2038669"} Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.119237 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" event={"ID":"8774e190-04c5-404d-b6e9-a86e93f0d0af","Type":"ContainerStarted","Data":"32e4b8c7dd53cc6edc725be36985529a8f3d076652b18de2debe3632bd22f8ac"} Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.210509 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw"] Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.211696 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.228263 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw"] Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.298700 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.299156 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhb4p\" (UniqueName: \"kubernetes.io/projected/68a17f8c-4beb-4779-9917-302efd887cf8-kube-api-access-rhb4p\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.299368 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.400762 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.400874 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.400905 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhb4p\" (UniqueName: \"kubernetes.io/projected/68a17f8c-4beb-4779-9917-302efd887cf8-kube-api-access-rhb4p\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.401793 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-util\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.401990 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-bundle\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: 
\"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.422356 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhb4p\" (UniqueName: \"kubernetes.io/projected/68a17f8c-4beb-4779-9917-302efd887cf8-kube-api-access-rhb4p\") pod \"8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.574832 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:24 crc kubenswrapper[4713]: I0127 15:57:24.790527 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw"] Jan 27 15:57:24 crc kubenswrapper[4713]: W0127 15:57:24.794234 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod68a17f8c_4beb_4779_9917_302efd887cf8.slice/crio-cddde8647f3b7fa75edc703b6e197e5f8fb2b1c7ea452cb38e82afc5436adcac WatchSource:0}: Error finding container cddde8647f3b7fa75edc703b6e197e5f8fb2b1c7ea452cb38e82afc5436adcac: Status 404 returned error can't find the container with id cddde8647f3b7fa75edc703b6e197e5f8fb2b1c7ea452cb38e82afc5436adcac Jan 27 15:57:25 crc kubenswrapper[4713]: I0127 15:57:25.127289 4713 generic.go:334] "Generic (PLEG): container finished" podID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerID="ffe3ea5609b8ab42d2b31f3b9345f68ae1aca5fa14ed59cd1c6a487fa2038669" exitCode=0 Jan 27 15:57:25 crc kubenswrapper[4713]: I0127 15:57:25.127430 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" event={"ID":"8774e190-04c5-404d-b6e9-a86e93f0d0af","Type":"ContainerDied","Data":"ffe3ea5609b8ab42d2b31f3b9345f68ae1aca5fa14ed59cd1c6a487fa2038669"} Jan 27 15:57:25 crc kubenswrapper[4713]: I0127 15:57:25.129405 4713 generic.go:334] "Generic (PLEG): container finished" podID="68a17f8c-4beb-4779-9917-302efd887cf8" containerID="e0537c8b6c19c922970042c2b112aa80587350da292377859bb07e3bb6b7d31d" exitCode=0 Jan 27 15:57:25 crc kubenswrapper[4713]: I0127 15:57:25.129660 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" event={"ID":"68a17f8c-4beb-4779-9917-302efd887cf8","Type":"ContainerDied","Data":"e0537c8b6c19c922970042c2b112aa80587350da292377859bb07e3bb6b7d31d"} Jan 27 15:57:25 crc kubenswrapper[4713]: I0127 15:57:25.129696 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" event={"ID":"68a17f8c-4beb-4779-9917-302efd887cf8","Type":"ContainerStarted","Data":"cddde8647f3b7fa75edc703b6e197e5f8fb2b1c7ea452cb38e82afc5436adcac"} Jan 27 15:57:28 crc kubenswrapper[4713]: I0127 15:57:28.150453 4713 generic.go:334] "Generic (PLEG): container finished" podID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerID="fcf5b11dc3b48b154847cf30da1e3ae74f78ae30b26d36a1f91642464cb0febc" exitCode=0 Jan 27 15:57:28 crc kubenswrapper[4713]: I0127 15:57:28.150566 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" event={"ID":"8774e190-04c5-404d-b6e9-a86e93f0d0af","Type":"ContainerDied","Data":"fcf5b11dc3b48b154847cf30da1e3ae74f78ae30b26d36a1f91642464cb0febc"} Jan 27 15:57:28 crc kubenswrapper[4713]: I0127 15:57:28.154554 4713 generic.go:334] "Generic (PLEG): container finished" podID="68a17f8c-4beb-4779-9917-302efd887cf8" 
containerID="deb3e9ff55d1ea30ec17e0c05502115f3dd17814d4adb19d7d0bebe8181466f9" exitCode=0 Jan 27 15:57:28 crc kubenswrapper[4713]: I0127 15:57:28.154617 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" event={"ID":"68a17f8c-4beb-4779-9917-302efd887cf8","Type":"ContainerDied","Data":"deb3e9ff55d1ea30ec17e0c05502115f3dd17814d4adb19d7d0bebe8181466f9"} Jan 27 15:57:29 crc kubenswrapper[4713]: I0127 15:57:29.163050 4713 generic.go:334] "Generic (PLEG): container finished" podID="68a17f8c-4beb-4779-9917-302efd887cf8" containerID="5097e05879ff85b0550f63a832c13e525d3ba2717df4a137040d81c9c86e5315" exitCode=0 Jan 27 15:57:29 crc kubenswrapper[4713]: I0127 15:57:29.163152 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" event={"ID":"68a17f8c-4beb-4779-9917-302efd887cf8","Type":"ContainerDied","Data":"5097e05879ff85b0550f63a832c13e525d3ba2717df4a137040d81c9c86e5315"} Jan 27 15:57:29 crc kubenswrapper[4713]: I0127 15:57:29.165646 4713 generic.go:334] "Generic (PLEG): container finished" podID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerID="c46cf9d633c9f647c09df8aca391f1b629017d757a87f62b68992c15a873bfe9" exitCode=0 Jan 27 15:57:29 crc kubenswrapper[4713]: I0127 15:57:29.165704 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" event={"ID":"8774e190-04c5-404d-b6e9-a86e93f0d0af","Type":"ContainerDied","Data":"c46cf9d633c9f647c09df8aca391f1b629017d757a87f62b68992c15a873bfe9"} Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.503947 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.505231 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.586440 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9zdz\" (UniqueName: \"kubernetes.io/projected/8774e190-04c5-404d-b6e9-a86e93f0d0af-kube-api-access-z9zdz\") pod \"8774e190-04c5-404d-b6e9-a86e93f0d0af\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.586497 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-bundle\") pod \"68a17f8c-4beb-4779-9917-302efd887cf8\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.586517 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-bundle\") pod \"8774e190-04c5-404d-b6e9-a86e93f0d0af\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.586537 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-util\") pod \"68a17f8c-4beb-4779-9917-302efd887cf8\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.586561 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhb4p\" (UniqueName: 
\"kubernetes.io/projected/68a17f8c-4beb-4779-9917-302efd887cf8-kube-api-access-rhb4p\") pod \"68a17f8c-4beb-4779-9917-302efd887cf8\" (UID: \"68a17f8c-4beb-4779-9917-302efd887cf8\") " Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.586617 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-util\") pod \"8774e190-04c5-404d-b6e9-a86e93f0d0af\" (UID: \"8774e190-04c5-404d-b6e9-a86e93f0d0af\") " Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.588316 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-bundle" (OuterVolumeSpecName: "bundle") pod "8774e190-04c5-404d-b6e9-a86e93f0d0af" (UID: "8774e190-04c5-404d-b6e9-a86e93f0d0af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.595824 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-bundle" (OuterVolumeSpecName: "bundle") pod "68a17f8c-4beb-4779-9917-302efd887cf8" (UID: "68a17f8c-4beb-4779-9917-302efd887cf8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.601294 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8774e190-04c5-404d-b6e9-a86e93f0d0af-kube-api-access-z9zdz" (OuterVolumeSpecName: "kube-api-access-z9zdz") pod "8774e190-04c5-404d-b6e9-a86e93f0d0af" (UID: "8774e190-04c5-404d-b6e9-a86e93f0d0af"). InnerVolumeSpecName "kube-api-access-z9zdz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.603186 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68a17f8c-4beb-4779-9917-302efd887cf8-kube-api-access-rhb4p" (OuterVolumeSpecName: "kube-api-access-rhb4p") pod "68a17f8c-4beb-4779-9917-302efd887cf8" (UID: "68a17f8c-4beb-4779-9917-302efd887cf8"). InnerVolumeSpecName "kube-api-access-rhb4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.603188 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-util" (OuterVolumeSpecName: "util") pod "8774e190-04c5-404d-b6e9-a86e93f0d0af" (UID: "8774e190-04c5-404d-b6e9-a86e93f0d0af"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.687817 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.687858 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z9zdz\" (UniqueName: \"kubernetes.io/projected/8774e190-04c5-404d-b6e9-a86e93f0d0af-kube-api-access-z9zdz\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.687871 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.687883 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8774e190-04c5-404d-b6e9-a86e93f0d0af-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:30 crc 
kubenswrapper[4713]: I0127 15:57:30.687892 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhb4p\" (UniqueName: \"kubernetes.io/projected/68a17f8c-4beb-4779-9917-302efd887cf8-kube-api-access-rhb4p\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.889645 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-util" (OuterVolumeSpecName: "util") pod "68a17f8c-4beb-4779-9917-302efd887cf8" (UID: "68a17f8c-4beb-4779-9917-302efd887cf8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:30 crc kubenswrapper[4713]: I0127 15:57:30.890423 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/68a17f8c-4beb-4779-9917-302efd887cf8-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.180718 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" event={"ID":"8774e190-04c5-404d-b6e9-a86e93f0d0af","Type":"ContainerDied","Data":"32e4b8c7dd53cc6edc725be36985529a8f3d076652b18de2debe3632bd22f8ac"} Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.180820 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32e4b8c7dd53cc6edc725be36985529a8f3d076652b18de2debe3632bd22f8ac" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.180754 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.183796 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" event={"ID":"68a17f8c-4beb-4779-9917-302efd887cf8","Type":"ContainerDied","Data":"cddde8647f3b7fa75edc703b6e197e5f8fb2b1c7ea452cb38e82afc5436adcac"} Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.183858 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cddde8647f3b7fa75edc703b6e197e5f8fb2b1c7ea452cb38e82afc5436adcac" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.183881 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495449 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq"] Jan 27 15:57:31 crc kubenswrapper[4713]: E0127 15:57:31.495685 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="util" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495700 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="util" Jan 27 15:57:31 crc kubenswrapper[4713]: E0127 15:57:31.495716 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="pull" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495722 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="pull" Jan 27 15:57:31 crc kubenswrapper[4713]: E0127 15:57:31.495730 4713 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="extract" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495736 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="extract" Jan 27 15:57:31 crc kubenswrapper[4713]: E0127 15:57:31.495745 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="pull" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495776 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="pull" Jan 27 15:57:31 crc kubenswrapper[4713]: E0127 15:57:31.495788 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="extract" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495794 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="extract" Jan 27 15:57:31 crc kubenswrapper[4713]: E0127 15:57:31.495809 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="util" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495815 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="util" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495907 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="68a17f8c-4beb-4779-9917-302efd887cf8" containerName="extract" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.495919 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="8774e190-04c5-404d-b6e9-a86e93f0d0af" containerName="extract" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.496711 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.499181 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.511310 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq"] Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.599360 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.599486 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.599511 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfz5p\" (UniqueName: \"kubernetes.io/projected/97ef285c-4b33-434e-8252-fca768d794de-kube-api-access-xfz5p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: 
I0127 15:57:31.700487 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.700798 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfz5p\" (UniqueName: \"kubernetes.io/projected/97ef285c-4b33-434e-8252-fca768d794de-kube-api-access-xfz5p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.700943 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.701109 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-util\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.701600 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-bundle\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.738126 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfz5p\" (UniqueName: \"kubernetes.io/projected/97ef285c-4b33-434e-8252-fca768d794de-kube-api-access-xfz5p\") pod \"1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:31 crc kubenswrapper[4713]: I0127 15:57:31.811728 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:32 crc kubenswrapper[4713]: I0127 15:57:32.131683 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq"] Jan 27 15:57:32 crc kubenswrapper[4713]: I0127 15:57:32.192842 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" event={"ID":"97ef285c-4b33-434e-8252-fca768d794de","Type":"ContainerStarted","Data":"c0abfd4cebe248c18f61eb95dceb3c9662be4061829b89afa65da61651d265c5"} Jan 27 15:57:33 crc kubenswrapper[4713]: I0127 15:57:33.221537 4713 generic.go:334] "Generic (PLEG): container finished" podID="97ef285c-4b33-434e-8252-fca768d794de" containerID="9fc2e20a1f0c3b8140d59af37a68e8e47e799c7af7b62a0f61a52d84b9f80b75" exitCode=0 Jan 27 15:57:33 crc kubenswrapper[4713]: I0127 15:57:33.221611 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" event={"ID":"97ef285c-4b33-434e-8252-fca768d794de","Type":"ContainerDied","Data":"9fc2e20a1f0c3b8140d59af37a68e8e47e799c7af7b62a0f61a52d84b9f80b75"} Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.106116 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.107209 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" Jan 27 15:57:34 crc kubenswrapper[4713]: W0127 15:57:34.111081 4713 reflector.go:561] object-"openshift-operators"/"obo-prometheus-operator-dockercfg-z59cb": failed to list *v1.Secret: secrets "obo-prometheus-operator-dockercfg-z59cb" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-operators": no relationship found between node 'crc' and this object Jan 27 15:57:34 crc kubenswrapper[4713]: E0127 15:57:34.111130 4713 reflector.go:158] "Unhandled Error" err="object-\"openshift-operators\"/\"obo-prometheus-operator-dockercfg-z59cb\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"obo-prometheus-operator-dockercfg-z59cb\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.111364 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.113483 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.125953 4713 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.133892 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nlm2\" (UniqueName: \"kubernetes.io/projected/09bbc2b2-7f95-4927-978a-96873d1737ad-kube-api-access-2nlm2\") pod \"obo-prometheus-operator-68bc856cb9-gjtdw\" (UID: \"09bbc2b2-7f95-4927-978a-96873d1737ad\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.239266 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.240120 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.241670 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nlm2\" (UniqueName: \"kubernetes.io/projected/09bbc2b2-7f95-4927-978a-96873d1737ad-kube-api-access-2nlm2\") pod \"obo-prometheus-operator-68bc856cb9-gjtdw\" (UID: \"09bbc2b2-7f95-4927-978a-96873d1737ad\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.246585 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-dockercfg-89nqk" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.247284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.252602 4713 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.253890 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.263655 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.269068 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nlm2\" (UniqueName: \"kubernetes.io/projected/09bbc2b2-7f95-4927-978a-96873d1737ad-kube-api-access-2nlm2\") pod \"obo-prometheus-operator-68bc856cb9-gjtdw\" (UID: \"09bbc2b2-7f95-4927-978a-96873d1737ad\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.294265 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.342607 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/341a77a0-e923-4418-a9fd-000eb6f5853d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn\" (UID: \"341a77a0-e923-4418-a9fd-000eb6f5853d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.342670 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd5e4f47-aa65-4231-989b-26060dc08c35-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8\" (UID: 
\"fd5e4f47-aa65-4231-989b-26060dc08c35\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.342773 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd5e4f47-aa65-4231-989b-26060dc08c35-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8\" (UID: \"fd5e4f47-aa65-4231-989b-26060dc08c35\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.342798 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/341a77a0-e923-4418-a9fd-000eb6f5853d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn\" (UID: \"341a77a0-e923-4418-a9fd-000eb6f5853d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.442575 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-vfldl"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.443494 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.443799 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd5e4f47-aa65-4231-989b-26060dc08c35-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8\" (UID: \"fd5e4f47-aa65-4231-989b-26060dc08c35\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.443859 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/341a77a0-e923-4418-a9fd-000eb6f5853d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn\" (UID: \"341a77a0-e923-4418-a9fd-000eb6f5853d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.443928 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/341a77a0-e923-4418-a9fd-000eb6f5853d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn\" (UID: \"341a77a0-e923-4418-a9fd-000eb6f5853d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.443961 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd5e4f47-aa65-4231-989b-26060dc08c35-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8\" (UID: \"fd5e4f47-aa65-4231-989b-26060dc08c35\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.445859 4713 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-sa-dockercfg-schvc" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.448716 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.448944 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/341a77a0-e923-4418-a9fd-000eb6f5853d-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn\" (UID: \"341a77a0-e923-4418-a9fd-000eb6f5853d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.450438 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fd5e4f47-aa65-4231-989b-26060dc08c35-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8\" (UID: \"fd5e4f47-aa65-4231-989b-26060dc08c35\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.451378 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/341a77a0-e923-4418-a9fd-000eb6f5853d-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn\" (UID: \"341a77a0-e923-4418-a9fd-000eb6f5853d\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.453158 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fd5e4f47-aa65-4231-989b-26060dc08c35-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8\" (UID: \"fd5e4f47-aa65-4231-989b-26060dc08c35\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.467305 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-vfldl"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.545352 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a96f9484-508a-4432-9364-af9abac0a60e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-vfldl\" (UID: \"a96f9484-508a-4432-9364-af9abac0a60e\") " pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.545458 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-px2nx\" (UniqueName: \"kubernetes.io/projected/a96f9484-508a-4432-9364-af9abac0a60e-kube-api-access-px2nx\") pod \"observability-operator-59bdc8b94-vfldl\" (UID: \"a96f9484-508a-4432-9364-af9abac0a60e\") " pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.571903 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.623957 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.646373 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a96f9484-508a-4432-9364-af9abac0a60e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-vfldl\" (UID: \"a96f9484-508a-4432-9364-af9abac0a60e\") " pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.646430 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-px2nx\" (UniqueName: \"kubernetes.io/projected/a96f9484-508a-4432-9364-af9abac0a60e-kube-api-access-px2nx\") pod \"observability-operator-59bdc8b94-vfldl\" (UID: \"a96f9484-508a-4432-9364-af9abac0a60e\") " pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.651878 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/a96f9484-508a-4432-9364-af9abac0a60e-observability-operator-tls\") pod \"observability-operator-59bdc8b94-vfldl\" (UID: \"a96f9484-508a-4432-9364-af9abac0a60e\") " pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.662755 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l9flh"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.663470 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.669284 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"perses-operator-dockercfg-j55sw" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.677116 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-px2nx\" (UniqueName: \"kubernetes.io/projected/a96f9484-508a-4432-9364-af9abac0a60e-kube-api-access-px2nx\") pod \"observability-operator-59bdc8b94-vfldl\" (UID: \"a96f9484-508a-4432-9364-af9abac0a60e\") " pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.698690 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l9flh"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.747653 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nj9r\" (UniqueName: \"kubernetes.io/projected/21e51ea9-6185-4ea8-bac3-032447d9da0c-kube-api-access-6nj9r\") pod \"perses-operator-5bf474d74f-l9flh\" (UID: \"21e51ea9-6185-4ea8-bac3-032447d9da0c\") " pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.747763 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e51ea9-6185-4ea8-bac3-032447d9da0c-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l9flh\" (UID: \"21e51ea9-6185-4ea8-bac3-032447d9da0c\") " pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.797415 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.850984 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e51ea9-6185-4ea8-bac3-032447d9da0c-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l9flh\" (UID: \"21e51ea9-6185-4ea8-bac3-032447d9da0c\") " pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.851070 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nj9r\" (UniqueName: \"kubernetes.io/projected/21e51ea9-6185-4ea8-bac3-032447d9da0c-kube-api-access-6nj9r\") pod \"perses-operator-5bf474d74f-l9flh\" (UID: \"21e51ea9-6185-4ea8-bac3-032447d9da0c\") " pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.852351 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/21e51ea9-6185-4ea8-bac3-032447d9da0c-openshift-service-ca\") pod \"perses-operator-5bf474d74f-l9flh\" (UID: \"21e51ea9-6185-4ea8-bac3-032447d9da0c\") " pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.888569 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.892595 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nj9r\" (UniqueName: \"kubernetes.io/projected/21e51ea9-6185-4ea8-bac3-032447d9da0c-kube-api-access-6nj9r\") pod \"perses-operator-5bf474d74f-l9flh\" (UID: \"21e51ea9-6185-4ea8-bac3-032447d9da0c\") " pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:34 crc kubenswrapper[4713]: 
W0127 15:57:34.906421 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod341a77a0_e923_4418_a9fd_000eb6f5853d.slice/crio-8d4a2bcf7f7e4911df408b89006986f66d4aade28b11a53199e76d1b81fae34b WatchSource:0}: Error finding container 8d4a2bcf7f7e4911df408b89006986f66d4aade28b11a53199e76d1b81fae34b: Status 404 returned error can't find the container with id 8d4a2bcf7f7e4911df408b89006986f66d4aade28b11a53199e76d1b81fae34b Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.977325 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8"] Jan 27 15:57:34 crc kubenswrapper[4713]: I0127 15:57:34.987548 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.091400 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-vfldl"] Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.265109 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" event={"ID":"a96f9484-508a-4432-9364-af9abac0a60e","Type":"ContainerStarted","Data":"a1d42f8d6bbddd034307738b1cfc9b7947a16ca378fdf74c3e05d31c974a3644"} Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.274735 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" event={"ID":"341a77a0-e923-4418-a9fd-000eb6f5853d","Type":"ContainerStarted","Data":"8d4a2bcf7f7e4911df408b89006986f66d4aade28b11a53199e76d1b81fae34b"} Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.281213 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" 
event={"ID":"fd5e4f47-aa65-4231-989b-26060dc08c35","Type":"ContainerStarted","Data":"27b106805878f4f21c88829406f95869ebfc47652ce17b97261479369ff5439d"} Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.282514 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-l9flh"] Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.359719 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-dockercfg-z59cb" Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.367333 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" Jan 27 15:57:35 crc kubenswrapper[4713]: I0127 15:57:35.677774 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw"] Jan 27 15:57:35 crc kubenswrapper[4713]: W0127 15:57:35.690716 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09bbc2b2_7f95_4927_978a_96873d1737ad.slice/crio-fa916ad261d805dafa8f7fe113fedbbc259547d85c8cc5e1d04a6c24ac27f30e WatchSource:0}: Error finding container fa916ad261d805dafa8f7fe113fedbbc259547d85c8cc5e1d04a6c24ac27f30e: Status 404 returned error can't find the container with id fa916ad261d805dafa8f7fe113fedbbc259547d85c8cc5e1d04a6c24ac27f30e Jan 27 15:57:36 crc kubenswrapper[4713]: I0127 15:57:36.288927 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-l9flh" event={"ID":"21e51ea9-6185-4ea8-bac3-032447d9da0c","Type":"ContainerStarted","Data":"c3c7ca7f6f657dacd86a2c9336f9986902a8ed41577ef4c8a8b1a1449875e2b0"} Jan 27 15:57:36 crc kubenswrapper[4713]: I0127 15:57:36.293132 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" 
event={"ID":"09bbc2b2-7f95-4927-978a-96873d1737ad","Type":"ContainerStarted","Data":"fa916ad261d805dafa8f7fe113fedbbc259547d85c8cc5e1d04a6c24ac27f30e"} Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.202185 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elastic-operator-6676c4dc45-vwkkm"] Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.203208 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.208260 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"kube-root-ca.crt" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.208545 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-dockercfg-zhxpm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.208712 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elastic-operator-service-cert" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.208880 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"openshift-service-ca.crt" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.231371 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-6676c4dc45-vwkkm"] Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.255105 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9821e7b7-9770-477e-82a1-8557a61c1680-webhook-cert\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.255206 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access-hxx8l\" (UniqueName: \"kubernetes.io/projected/9821e7b7-9770-477e-82a1-8557a61c1680-kube-api-access-hxx8l\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.255255 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9821e7b7-9770-477e-82a1-8557a61c1680-apiservice-cert\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.356806 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9821e7b7-9770-477e-82a1-8557a61c1680-webhook-cert\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.356899 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxx8l\" (UniqueName: \"kubernetes.io/projected/9821e7b7-9770-477e-82a1-8557a61c1680-kube-api-access-hxx8l\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.356928 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9821e7b7-9770-477e-82a1-8557a61c1680-apiservice-cert\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.367950 
4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9821e7b7-9770-477e-82a1-8557a61c1680-apiservice-cert\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.384318 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxx8l\" (UniqueName: \"kubernetes.io/projected/9821e7b7-9770-477e-82a1-8557a61c1680-kube-api-access-hxx8l\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.396128 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9821e7b7-9770-477e-82a1-8557a61c1680-webhook-cert\") pod \"elastic-operator-6676c4dc45-vwkkm\" (UID: \"9821e7b7-9770-477e-82a1-8557a61c1680\") " pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:40 crc kubenswrapper[4713]: I0127 15:57:40.532583 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.198463 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-cnlwf"] Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.200406 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.204740 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"interconnect-operator-dockercfg-n8ks2" Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.205569 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-cnlwf"] Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.229552 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdzpw\" (UniqueName: \"kubernetes.io/projected/4d754b5c-22af-4ec7-bef6-a2a363b29a67-kube-api-access-sdzpw\") pod \"interconnect-operator-5bb49f789d-cnlwf\" (UID: \"4d754b5c-22af-4ec7-bef6-a2a363b29a67\") " pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.330945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdzpw\" (UniqueName: \"kubernetes.io/projected/4d754b5c-22af-4ec7-bef6-a2a363b29a67-kube-api-access-sdzpw\") pod \"interconnect-operator-5bb49f789d-cnlwf\" (UID: \"4d754b5c-22af-4ec7-bef6-a2a363b29a67\") " pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.358497 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdzpw\" (UniqueName: \"kubernetes.io/projected/4d754b5c-22af-4ec7-bef6-a2a363b29a67-kube-api-access-sdzpw\") pod \"interconnect-operator-5bb49f789d-cnlwf\" (UID: \"4d754b5c-22af-4ec7-bef6-a2a363b29a67\") " pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" Jan 27 15:57:44 crc kubenswrapper[4713]: I0127 15:57:44.524110 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" Jan 27 15:57:52 crc kubenswrapper[4713]: E0127 15:57:52.176081 4713 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea" Jan 27 15:57:52 crc kubenswrapper[4713]: E0127 15:57:52.177005 4713 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:prometheus-operator-admission-webhook,Image:registry.redhat.io/cluster-observability-operator/obo-prometheus-operator-admission-webhook-rhel9@sha256:42ebc3571195d8c41fd01b8d08e98fe2cc12c1caabea251aecb4442d8eade4ea,Command:[],Args:[--web.enable-tls=true --web.cert-file=/tmp/k8s-webhook-server/serving-certs/tls.crt --web.key-file=/tmp/k8s-webhook-server/serving-certs/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:https,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_CONDITION_NAME,Value:cluster-observability-operator.v1.3.1,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{209715200 0} {} BinarySI},},Requests:ResourceList{cpu: {{50 -3} {} 50m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:apiservice-cert,ReadOnly:false,MountPath:/apiserver.local.config/certificates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/tmp/k8s-webhook-server/serving-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8_openshift-operators(fd5e4f47-aa65-4231-989b-26060dc08c35): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 27 15:57:52 crc kubenswrapper[4713]: E0127 15:57:52.178183 4713 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"prometheus-operator-admission-webhook\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" podUID="fd5e4f47-aa65-4231-989b-26060dc08c35" Jan 27 15:57:52 crc kubenswrapper[4713]: I0127 15:57:52.551945 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/interconnect-operator-5bb49f789d-cnlwf"] Jan 27 15:57:52 crc kubenswrapper[4713]: 
W0127 15:57:52.582636 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d754b5c_22af_4ec7_bef6_a2a363b29a67.slice/crio-41c61a9b00174d72868b3df8ea105309cb8e34665a13845f578f43aec98b31cc WatchSource:0}: Error finding container 41c61a9b00174d72868b3df8ea105309cb8e34665a13845f578f43aec98b31cc: Status 404 returned error can't find the container with id 41c61a9b00174d72868b3df8ea105309cb8e34665a13845f578f43aec98b31cc Jan 27 15:57:52 crc kubenswrapper[4713]: I0127 15:57:52.716778 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elastic-operator-6676c4dc45-vwkkm"] Jan 27 15:57:52 crc kubenswrapper[4713]: W0127 15:57:52.753443 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9821e7b7_9770_477e_82a1_8557a61c1680.slice/crio-e54cafaad20ec378b604481f8f5a1e864fbfab9f9a3f874a570685b738acd2e9 WatchSource:0}: Error finding container e54cafaad20ec378b604481f8f5a1e864fbfab9f9a3f874a570685b738acd2e9: Status 404 returned error can't find the container with id e54cafaad20ec378b604481f8f5a1e864fbfab9f9a3f874a570685b738acd2e9 Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.443929 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" event={"ID":"4d754b5c-22af-4ec7-bef6-a2a363b29a67","Type":"ContainerStarted","Data":"41c61a9b00174d72868b3df8ea105309cb8e34665a13845f578f43aec98b31cc"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.446527 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" event={"ID":"a96f9484-508a-4432-9364-af9abac0a60e","Type":"ContainerStarted","Data":"bbcf2e29e11901875c277a7a8d6bb537227563de9b6f942f206ef9401fc5f65a"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.446807 4713 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.449782 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" event={"ID":"341a77a0-e923-4418-a9fd-000eb6f5853d","Type":"ContainerStarted","Data":"929a67636fbe59b396a1fc6392b724f9cb4c04b59fdc0c1e31e9cee6a19a53aa"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.452733 4713 generic.go:334] "Generic (PLEG): container finished" podID="97ef285c-4b33-434e-8252-fca768d794de" containerID="954a765d250becdb7279ba7d4a75429fecb19b03cbb99f0d1166fa8aaa310781" exitCode=0 Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.452782 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" event={"ID":"97ef285c-4b33-434e-8252-fca768d794de","Type":"ContainerDied","Data":"954a765d250becdb7279ba7d4a75429fecb19b03cbb99f0d1166fa8aaa310781"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.456609 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-l9flh" event={"ID":"21e51ea9-6185-4ea8-bac3-032447d9da0c","Type":"ContainerStarted","Data":"0d36524dd9d852521076d7b6ab3a77c5834ee010da13016c2500b17000a05b78"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.456830 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-l9flh" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.459544 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" event={"ID":"fd5e4f47-aa65-4231-989b-26060dc08c35","Type":"ContainerStarted","Data":"eb988ebbff44d2c16db236895d6c235e37abff7c43be04a6af6c234ff07bf280"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 
15:57:53.460516 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.461240 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" event={"ID":"9821e7b7-9770-477e-82a1-8557a61c1680","Type":"ContainerStarted","Data":"e54cafaad20ec378b604481f8f5a1e864fbfab9f9a3f874a570685b738acd2e9"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.462636 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" event={"ID":"09bbc2b2-7f95-4927-978a-96873d1737ad","Type":"ContainerStarted","Data":"d63026a7b1414c01d3fa62c44ab772701f77ca7c44246f7d5b7669d949dcc5e6"} Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.484435 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-vfldl" podStartSLOduration=2.262349721 podStartE2EDuration="19.484411593s" podCreationTimestamp="2026-01-27 15:57:34 +0000 UTC" firstStartedPulling="2026-01-27 15:57:35.109608388 +0000 UTC m=+622.887818326" lastFinishedPulling="2026-01-27 15:57:52.33167026 +0000 UTC m=+640.109880198" observedRunningTime="2026-01-27 15:57:53.479751929 +0000 UTC m=+641.257961867" watchObservedRunningTime="2026-01-27 15:57:53.484411593 +0000 UTC m=+641.262621531" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.512489 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-gjtdw" podStartSLOduration=2.979327881 podStartE2EDuration="19.512463471s" podCreationTimestamp="2026-01-27 15:57:34 +0000 UTC" firstStartedPulling="2026-01-27 15:57:35.698916091 +0000 UTC m=+623.477126029" lastFinishedPulling="2026-01-27 15:57:52.232051681 +0000 UTC m=+640.010261619" observedRunningTime="2026-01-27 15:57:53.508526187 
+0000 UTC m=+641.286736125" watchObservedRunningTime="2026-01-27 15:57:53.512463471 +0000 UTC m=+641.290673399" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.540431 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8" podStartSLOduration=-9223372017.31437 podStartE2EDuration="19.540407276s" podCreationTimestamp="2026-01-27 15:57:34 +0000 UTC" firstStartedPulling="2026-01-27 15:57:34.995635235 +0000 UTC m=+622.773845163" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:57:53.539454378 +0000 UTC m=+641.317664326" watchObservedRunningTime="2026-01-27 15:57:53.540407276 +0000 UTC m=+641.318617214" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.607331 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn" podStartSLOduration=2.284442007 podStartE2EDuration="19.607309543s" podCreationTimestamp="2026-01-27 15:57:34 +0000 UTC" firstStartedPulling="2026-01-27 15:57:34.90866774 +0000 UTC m=+622.686877678" lastFinishedPulling="2026-01-27 15:57:52.231535276 +0000 UTC m=+640.009745214" observedRunningTime="2026-01-27 15:57:53.569162274 +0000 UTC m=+641.347372212" watchObservedRunningTime="2026-01-27 15:57:53.607309543 +0000 UTC m=+641.385519481" Jan 27 15:57:53 crc kubenswrapper[4713]: I0127 15:57:53.636689 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-l9flh" podStartSLOduration=2.699913444 podStartE2EDuration="19.636654988s" podCreationTimestamp="2026-01-27 15:57:34 +0000 UTC" firstStartedPulling="2026-01-27 15:57:35.309625749 +0000 UTC m=+623.087835687" lastFinishedPulling="2026-01-27 15:57:52.246367293 +0000 UTC m=+640.024577231" observedRunningTime="2026-01-27 15:57:53.633090485 +0000 UTC m=+641.411300443" 
watchObservedRunningTime="2026-01-27 15:57:53.636654988 +0000 UTC m=+641.414864926" Jan 27 15:57:54 crc kubenswrapper[4713]: I0127 15:57:54.480139 4713 generic.go:334] "Generic (PLEG): container finished" podID="97ef285c-4b33-434e-8252-fca768d794de" containerID="bbe1f615eac579308f3655eeb8d9acbe8eb03e84b739b017b7e93ca06ba13e00" exitCode=0 Jan 27 15:57:54 crc kubenswrapper[4713]: I0127 15:57:54.480266 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" event={"ID":"97ef285c-4b33-434e-8252-fca768d794de","Type":"ContainerDied","Data":"bbe1f615eac579308f3655eeb8d9acbe8eb03e84b739b017b7e93ca06ba13e00"} Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.649915 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.733546 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-bundle\") pod \"97ef285c-4b33-434e-8252-fca768d794de\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.733883 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfz5p\" (UniqueName: \"kubernetes.io/projected/97ef285c-4b33-434e-8252-fca768d794de-kube-api-access-xfz5p\") pod \"97ef285c-4b33-434e-8252-fca768d794de\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.734000 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-util\") pod \"97ef285c-4b33-434e-8252-fca768d794de\" (UID: \"97ef285c-4b33-434e-8252-fca768d794de\") " Jan 27 15:57:56 crc 
kubenswrapper[4713]: I0127 15:57:56.734650 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-bundle" (OuterVolumeSpecName: "bundle") pod "97ef285c-4b33-434e-8252-fca768d794de" (UID: "97ef285c-4b33-434e-8252-fca768d794de"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.743237 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97ef285c-4b33-434e-8252-fca768d794de-kube-api-access-xfz5p" (OuterVolumeSpecName: "kube-api-access-xfz5p") pod "97ef285c-4b33-434e-8252-fca768d794de" (UID: "97ef285c-4b33-434e-8252-fca768d794de"). InnerVolumeSpecName "kube-api-access-xfz5p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.768330 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-util" (OuterVolumeSpecName: "util") pod "97ef285c-4b33-434e-8252-fca768d794de" (UID: "97ef285c-4b33-434e-8252-fca768d794de"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.835391 4713 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-bundle\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.835429 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfz5p\" (UniqueName: \"kubernetes.io/projected/97ef285c-4b33-434e-8252-fca768d794de-kube-api-access-xfz5p\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:56 crc kubenswrapper[4713]: I0127 15:57:56.835443 4713 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/97ef285c-4b33-434e-8252-fca768d794de-util\") on node \"crc\" DevicePath \"\"" Jan 27 15:57:57 crc kubenswrapper[4713]: I0127 15:57:57.508995 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" event={"ID":"97ef285c-4b33-434e-8252-fca768d794de","Type":"ContainerDied","Data":"c0abfd4cebe248c18f61eb95dceb3c9662be4061829b89afa65da61651d265c5"} Jan 27 15:57:57 crc kubenswrapper[4713]: I0127 15:57:57.509434 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c0abfd4cebe248c18f61eb95dceb3c9662be4061829b89afa65da61651d265c5" Jan 27 15:57:57 crc kubenswrapper[4713]: I0127 15:57:57.509359 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq" Jan 27 15:57:57 crc kubenswrapper[4713]: I0127 15:57:57.512423 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" event={"ID":"9821e7b7-9770-477e-82a1-8557a61c1680","Type":"ContainerStarted","Data":"f41593acd859f415dd219192a1cee80ba9c097f873cd811d523aa8f88e44c205"} Jan 27 15:57:57 crc kubenswrapper[4713]: I0127 15:57:57.544880 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elastic-operator-6676c4dc45-vwkkm" podStartSLOduration=13.382522058 podStartE2EDuration="17.544853225s" podCreationTimestamp="2026-01-27 15:57:40 +0000 UTC" firstStartedPulling="2026-01-27 15:57:52.759908675 +0000 UTC m=+640.538118613" lastFinishedPulling="2026-01-27 15:57:56.922239842 +0000 UTC m=+644.700449780" observedRunningTime="2026-01-27 15:57:57.544023471 +0000 UTC m=+645.322233409" watchObservedRunningTime="2026-01-27 15:57:57.544853225 +0000 UTC m=+645.323063163" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.381659 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/elasticsearch-es-default-0"] Jan 27 15:57:58 crc kubenswrapper[4713]: E0127 15:57:58.381921 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ef285c-4b33-434e-8252-fca768d794de" containerName="extract" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.381938 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ef285c-4b33-434e-8252-fca768d794de" containerName="extract" Jan 27 15:57:58 crc kubenswrapper[4713]: E0127 15:57:58.381977 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ef285c-4b33-434e-8252-fca768d794de" containerName="util" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.381984 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ef285c-4b33-434e-8252-fca768d794de" 
containerName="util" Jan 27 15:57:58 crc kubenswrapper[4713]: E0127 15:57:58.381997 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97ef285c-4b33-434e-8252-fca768d794de" containerName="pull" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.382004 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="97ef285c-4b33-434e-8252-fca768d794de" containerName="pull" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.382181 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="97ef285c-4b33-434e-8252-fca768d794de" containerName="extract" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.382997 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.385957 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-transport-certs" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.386304 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-remote-ca" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.386818 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-default-es-config" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.386953 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-unicast-hosts" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.387509 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"elasticsearch-es-scripts" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.387969 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"default-dockercfg-jh6x2" Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.388026 4713 reflector.go:368] Caches populated for 
*v1.Secret from object-"service-telemetry"/"elasticsearch-es-http-certs-internal"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.388115 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-internal-users"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.388278 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"elasticsearch-es-xpack-file-realm"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460134 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460192 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460255 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460287 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460314 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460375 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460403 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460522 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460598 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/8177832b-7570-42d2-8b38-2764cff6dc6d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460626 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460678 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460708 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460736 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.460803 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.477558 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561701 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561778 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561811 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561842 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561869 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561906 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561932 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.561999 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562099 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562130 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/8177832b-7570-42d2-8b38-2764cff6dc6d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562169 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562195 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562223 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562261 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562297 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562492 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-config-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.562894 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-bin-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-bin-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.563841 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-data\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elasticsearch-data\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.565505 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elasticsearch-logs\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elasticsearch-logs\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.566498 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-unicast-hosts\" (UniqueName: \"kubernetes.io/configmap/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-unicast-hosts\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.567346 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-scripts\" (UniqueName: \"kubernetes.io/configmap/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-scripts\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.569221 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-http-certificates\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-http-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.569842 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-xpack-file-realm\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-xpack-file-realm\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.570290 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-plugins-local\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-plugins-local\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.575310 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"downward-api\" (UniqueName: \"kubernetes.io/downward-api/8177832b-7570-42d2-8b38-2764cff6dc6d-downward-api\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.578875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-probe-user\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-probe-user\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.579234 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-remote-certificate-authorities\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-remote-certificate-authorities\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.581148 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp-volume\" (UniqueName: \"kubernetes.io/empty-dir/8177832b-7570-42d2-8b38-2764cff6dc6d-tmp-volume\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.581978 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-elasticsearch-config\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-elasticsearch-config\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.586562 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"elastic-internal-transport-certificates\" (UniqueName: \"kubernetes.io/secret/8177832b-7570-42d2-8b38-2764cff6dc6d-elastic-internal-transport-certificates\") pod \"elasticsearch-es-default-0\" (UID: \"8177832b-7570-42d2-8b38-2764cff6dc6d\") " pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:57:58 crc kubenswrapper[4713]: I0127 15:57:58.701928 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:58:02 crc kubenswrapper[4713]: I0127 15:58:02.753327 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Jan 27 15:58:03 crc kubenswrapper[4713]: I0127 15:58:03.556693 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" event={"ID":"4d754b5c-22af-4ec7-bef6-a2a363b29a67","Type":"ContainerStarted","Data":"d6cd03df54306a67d1a3ea5ce130384b90f60748adea8e2b434f441973143bea"}
Jan 27 15:58:03 crc kubenswrapper[4713]: I0127 15:58:03.558544 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8177832b-7570-42d2-8b38-2764cff6dc6d","Type":"ContainerStarted","Data":"f3305c4b070b2b423476d21ae23866be617962a6e78c36c0f43904450f997484"}
Jan 27 15:58:03 crc kubenswrapper[4713]: I0127 15:58:03.572223 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/interconnect-operator-5bb49f789d-cnlwf" podStartSLOduration=9.660064503 podStartE2EDuration="19.572195099s" podCreationTimestamp="2026-01-27 15:57:44 +0000 UTC" firstStartedPulling="2026-01-27 15:57:52.585018678 +0000 UTC m=+640.363228616" lastFinishedPulling="2026-01-27 15:58:02.497149174 +0000 UTC m=+650.275359212" observedRunningTime="2026-01-27 15:58:03.572160558 +0000 UTC m=+651.350370506" watchObservedRunningTime="2026-01-27 15:58:03.572195099 +0000 UTC m=+651.350405037"
Jan 27 15:58:04 crc kubenswrapper[4713]: I0127 15:58:04.990257 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-l9flh"
Jan 27 15:58:13 crc kubenswrapper[4713]: I0127 15:58:13.270945 4713 scope.go:117] "RemoveContainer" containerID="d8a49a95a57ace26bb9f5f2b76d3662c8a88064a53cdf39542dc93ff1ff98e56"
Jan 27 15:58:18 crc kubenswrapper[4713]: I0127 15:58:18.669115 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8177832b-7570-42d2-8b38-2764cff6dc6d","Type":"ContainerStarted","Data":"2ecd13731212e34422f04f1c9b8ac5b52915402bf9877de366ef2362228a3dfa"}
Jan 27 15:58:18 crc kubenswrapper[4713]: I0127 15:58:18.878404 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Jan 27 15:58:18 crc kubenswrapper[4713]: I0127 15:58:18.938478 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/elasticsearch-es-default-0"]
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.051806 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"]
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.055578 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.068298 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.068510 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.068310 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager-operator"/"cert-manager-operator-controller-manager-dockercfg-957r4"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.091644 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"]
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.173853 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a0ab383-293f-4dde-b3f4-ac41015872ec-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-wqxn9\" (UID: \"0a0ab383-293f-4dde-b3f4-ac41015872ec\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.173949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bzr6\" (UniqueName: \"kubernetes.io/projected/0a0ab383-293f-4dde-b3f4-ac41015872ec-kube-api-access-9bzr6\") pod \"cert-manager-operator-controller-manager-5446d6888b-wqxn9\" (UID: \"0a0ab383-293f-4dde-b3f4-ac41015872ec\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.274957 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9bzr6\" (UniqueName: \"kubernetes.io/projected/0a0ab383-293f-4dde-b3f4-ac41015872ec-kube-api-access-9bzr6\") pod \"cert-manager-operator-controller-manager-5446d6888b-wqxn9\" (UID: \"0a0ab383-293f-4dde-b3f4-ac41015872ec\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.275098 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a0ab383-293f-4dde-b3f4-ac41015872ec-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-wqxn9\" (UID: \"0a0ab383-293f-4dde-b3f4-ac41015872ec\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.275649 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0a0ab383-293f-4dde-b3f4-ac41015872ec-tmp\") pod \"cert-manager-operator-controller-manager-5446d6888b-wqxn9\" (UID: \"0a0ab383-293f-4dde-b3f4-ac41015872ec\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.297414 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bzr6\" (UniqueName: \"kubernetes.io/projected/0a0ab383-293f-4dde-b3f4-ac41015872ec-kube-api-access-9bzr6\") pod \"cert-manager-operator-controller-manager-5446d6888b-wqxn9\" (UID: \"0a0ab383-293f-4dde-b3f4-ac41015872ec\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.393773 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.628309 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9"]
Jan 27 15:58:19 crc kubenswrapper[4713]: W0127 15:58:19.639102 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0ab383_293f_4dde_b3f4_ac41015872ec.slice/crio-b02422cc22b2ccbf8fae7434aa99aed5fd76031e4f3c4c20dc279e8b40852265 WatchSource:0}: Error finding container b02422cc22b2ccbf8fae7434aa99aed5fd76031e4f3c4c20dc279e8b40852265: Status 404 returned error can't find the container with id b02422cc22b2ccbf8fae7434aa99aed5fd76031e4f3c4c20dc279e8b40852265
Jan 27 15:58:19 crc kubenswrapper[4713]: I0127 15:58:19.678741 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9" event={"ID":"0a0ab383-293f-4dde-b3f4-ac41015872ec","Type":"ContainerStarted","Data":"b02422cc22b2ccbf8fae7434aa99aed5fd76031e4f3c4c20dc279e8b40852265"}
Jan 27 15:58:20 crc kubenswrapper[4713]: I0127 15:58:20.690134 4713 generic.go:334] "Generic (PLEG): container finished" podID="8177832b-7570-42d2-8b38-2764cff6dc6d" containerID="2ecd13731212e34422f04f1c9b8ac5b52915402bf9877de366ef2362228a3dfa" exitCode=0
Jan 27 15:58:20 crc kubenswrapper[4713]: I0127 15:58:20.690701 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8177832b-7570-42d2-8b38-2764cff6dc6d","Type":"ContainerDied","Data":"2ecd13731212e34422f04f1c9b8ac5b52915402bf9877de366ef2362228a3dfa"}
Jan 27 15:58:24 crc kubenswrapper[4713]: I0127 15:58:24.714534 4713 generic.go:334] "Generic (PLEG): container finished" podID="8177832b-7570-42d2-8b38-2764cff6dc6d" containerID="56412b50ceb61cb501185aada26f5f18ece5c24d9723adf67253c3f40d62ad91" exitCode=0
Jan 27 15:58:24 crc kubenswrapper[4713]: I0127 15:58:24.714620 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8177832b-7570-42d2-8b38-2764cff6dc6d","Type":"ContainerDied","Data":"56412b50ceb61cb501185aada26f5f18ece5c24d9723adf67253c3f40d62ad91"}
Jan 27 15:58:25 crc kubenswrapper[4713]: I0127 15:58:25.724766 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/elasticsearch-es-default-0" event={"ID":"8177832b-7570-42d2-8b38-2764cff6dc6d","Type":"ContainerStarted","Data":"feff25574e314f79f24be010518b027a07e14c0ef2dfbf7d01fb81c36cfb0b99"}
Jan 27 15:58:25 crc kubenswrapper[4713]: I0127 15:58:25.725180 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="service-telemetry/elasticsearch-es-default-0"
Jan 27 15:58:25 crc kubenswrapper[4713]: I0127 15:58:25.759242 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="service-telemetry/elasticsearch-es-default-0" podStartSLOduration=12.888019026 podStartE2EDuration="27.759216335s" podCreationTimestamp="2026-01-27 15:57:58 +0000 UTC" firstStartedPulling="2026-01-27 15:58:02.764064112 +0000 UTC m=+650.542274060" lastFinishedPulling="2026-01-27 15:58:17.635261431 +0000 UTC m=+665.413471369" observedRunningTime="2026-01-27 15:58:25.756066544 +0000 UTC m=+673.534276492" watchObservedRunningTime="2026-01-27 15:58:25.759216335 +0000 UTC m=+673.537426283"
Jan 27 15:58:31 crc kubenswrapper[4713]: I0127 15:58:31.765938 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9" event={"ID":"0a0ab383-293f-4dde-b3f4-ac41015872ec","Type":"ContainerStarted","Data":"c1ebbc6b587dc6cee6446c208fdc9d66aebbf8535ea9a54d2e52596a75bf6ac0"}
Jan 27 15:58:31 crc kubenswrapper[4713]: I0127 15:58:31.787191 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-5446d6888b-wqxn9" podStartSLOduration=1.100070528 podStartE2EDuration="12.787171935s" podCreationTimestamp="2026-01-27 15:58:19 +0000 UTC" firstStartedPulling="2026-01-27 15:58:19.642158726 +0000 UTC m=+667.420368664" lastFinishedPulling="2026-01-27 15:58:31.329260133 +0000 UTC m=+679.107470071" observedRunningTime="2026-01-27 15:58:31.784305333 +0000 UTC m=+679.562515261" watchObservedRunningTime="2026-01-27 15:58:31.787171935 +0000 UTC m=+679.565381873"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.114352 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.115960 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.117919 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-global-ca"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.118017 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-5v6b8"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.118390 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-sys-config"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.119479 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-1-ca"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.133334 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"]
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.177943 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.177988 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178011 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178048 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178104 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178226 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178254 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178293 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skkxf\" (UniqueName: \"kubernetes.io/projected/187d6b27-9093-4df4-b5cd-a89ed60f7d29-kube-api-access-skkxf\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178318 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178377 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178413 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.178438 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280239 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build"
Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280329 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skkxf\" (UniqueName: \"kubernetes.io/projected/187d6b27-9093-4df4-b5cd-a89ed60f7d29-kube-api-access-skkxf\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\")
" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280361 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280398 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280465 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280501 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280536 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: 
\"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280564 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280590 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280619 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280645 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280676 4713 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280913 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-run\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.280956 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-root\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.281024 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildcachedir\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.281168 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildworkdir\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc 
kubenswrapper[4713]: I0127 15:58:33.281295 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-blob-cache\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.281611 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-node-pullsecrets\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.281742 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-system-configs\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.281752 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-proxy-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.282304 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-ca-bundles\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " 
pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.288853 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.297234 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.301744 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skkxf\" (UniqueName: \"kubernetes.io/projected/187d6b27-9093-4df4-b5cd-a89ed60f7d29-kube-api-access-skkxf\") pod \"service-telemetry-operator-1-build\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.353444 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-llgg9"] Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.354279 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.356877 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.357299 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.365588 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-llgg9"] Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.432647 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.483290 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a892d6-5f04-4cbd-b78b-2f1ca484b3a3-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-llgg9\" (UID: \"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.483384 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vxftp\" (UniqueName: \"kubernetes.io/projected/81a892d6-5f04-4cbd-b78b-2f1ca484b3a3-kube-api-access-vxftp\") pod \"cert-manager-webhook-f4fb5df64-llgg9\" (UID: \"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.588679 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vxftp\" (UniqueName: \"kubernetes.io/projected/81a892d6-5f04-4cbd-b78b-2f1ca484b3a3-kube-api-access-vxftp\") pod \"cert-manager-webhook-f4fb5df64-llgg9\" (UID: 
\"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.589133 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a892d6-5f04-4cbd-b78b-2f1ca484b3a3-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-llgg9\" (UID: \"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.613650 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vxftp\" (UniqueName: \"kubernetes.io/projected/81a892d6-5f04-4cbd-b78b-2f1ca484b3a3-kube-api-access-vxftp\") pod \"cert-manager-webhook-f4fb5df64-llgg9\" (UID: \"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.626736 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/81a892d6-5f04-4cbd-b78b-2f1ca484b3a3-bound-sa-token\") pod \"cert-manager-webhook-f4fb5df64-llgg9\" (UID: \"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3\") " pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.674891 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.981917 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 15:58:33 crc kubenswrapper[4713]: I0127 15:58:33.985286 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-f4fb5df64-llgg9"] Jan 27 15:58:33 crc kubenswrapper[4713]: W0127 15:58:33.986349 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod81a892d6_5f04_4cbd_b78b_2f1ca484b3a3.slice/crio-569079f64ba5961134f13105e2f170b6536e389f2de12147d8ddd7f64e3940ea WatchSource:0}: Error finding container 569079f64ba5961134f13105e2f170b6536e389f2de12147d8ddd7f64e3940ea: Status 404 returned error can't find the container with id 569079f64ba5961134f13105e2f170b6536e389f2de12147d8ddd7f64e3940ea Jan 27 15:58:33 crc kubenswrapper[4713]: W0127 15:58:33.992622 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod187d6b27_9093_4df4_b5cd_a89ed60f7d29.slice/crio-3fc5b3bbc9e0e4228a2ae6c6ec9a8dd1d7c3233000662bc9c477f40a2898fdca WatchSource:0}: Error finding container 3fc5b3bbc9e0e4228a2ae6c6ec9a8dd1d7c3233000662bc9c477f40a2898fdca: Status 404 returned error can't find the container with id 3fc5b3bbc9e0e4228a2ae6c6ec9a8dd1d7c3233000662bc9c477f40a2898fdca Jan 27 15:58:34 crc kubenswrapper[4713]: I0127 15:58:34.788861 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" event={"ID":"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3","Type":"ContainerStarted","Data":"569079f64ba5961134f13105e2f170b6536e389f2de12147d8ddd7f64e3940ea"} Jan 27 15:58:34 crc kubenswrapper[4713]: I0127 15:58:34.790155 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"187d6b27-9093-4df4-b5cd-a89ed60f7d29","Type":"ContainerStarted","Data":"3fc5b3bbc9e0e4228a2ae6c6ec9a8dd1d7c3233000662bc9c477f40a2898fdca"} Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.723428 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd"] Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.724872 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.727992 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-4b4r5" Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.735994 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd"] Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.861427 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8fqst\" (UniqueName: \"kubernetes.io/projected/d3b82c14-e19f-4811-ad1b-c381f5629a67-kube-api-access-8fqst\") pod \"cert-manager-cainjector-855d9ccff4-nx2kd\" (UID: \"d3b82c14-e19f-4811-ad1b-c381f5629a67\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.861655 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3b82c14-e19f-4811-ad1b-c381f5629a67-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-nx2kd\" (UID: \"d3b82c14-e19f-4811-ad1b-c381f5629a67\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.963384 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/d3b82c14-e19f-4811-ad1b-c381f5629a67-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-nx2kd\" (UID: \"d3b82c14-e19f-4811-ad1b-c381f5629a67\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.963449 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8fqst\" (UniqueName: \"kubernetes.io/projected/d3b82c14-e19f-4811-ad1b-c381f5629a67-kube-api-access-8fqst\") pod \"cert-manager-cainjector-855d9ccff4-nx2kd\" (UID: \"d3b82c14-e19f-4811-ad1b-c381f5629a67\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:37 crc kubenswrapper[4713]: I0127 15:58:37.991875 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8fqst\" (UniqueName: \"kubernetes.io/projected/d3b82c14-e19f-4811-ad1b-c381f5629a67-kube-api-access-8fqst\") pod \"cert-manager-cainjector-855d9ccff4-nx2kd\" (UID: \"d3b82c14-e19f-4811-ad1b-c381f5629a67\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:38 crc kubenswrapper[4713]: I0127 15:58:38.001614 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d3b82c14-e19f-4811-ad1b-c381f5629a67-bound-sa-token\") pod \"cert-manager-cainjector-855d9ccff4-nx2kd\" (UID: \"d3b82c14-e19f-4811-ad1b-c381f5629a67\") " pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:38 crc kubenswrapper[4713]: I0127 15:58:38.081907 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" Jan 27 15:58:38 crc kubenswrapper[4713]: I0127 15:58:38.557986 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd"] Jan 27 15:58:38 crc kubenswrapper[4713]: I0127 15:58:38.815741 4713 prober.go:107] "Probe failed" probeType="Readiness" pod="service-telemetry/elasticsearch-es-default-0" podUID="8177832b-7570-42d2-8b38-2764cff6dc6d" containerName="elasticsearch" probeResult="failure" output=< Jan 27 15:58:38 crc kubenswrapper[4713]: {"timestamp": "2026-01-27T15:58:38+00:00", "message": "readiness probe failed", "curl_rc": "7"} Jan 27 15:58:38 crc kubenswrapper[4713]: > Jan 27 15:58:38 crc kubenswrapper[4713]: I0127 15:58:38.831330 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" event={"ID":"d3b82c14-e19f-4811-ad1b-c381f5629a67","Type":"ContainerStarted","Data":"9fe1b1470d5832e73b5cef066615c86980acc9f9ae6a8a6f41780b9b24661eef"} Jan 27 15:58:43 crc kubenswrapper[4713]: I0127 15:58:43.568579 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 15:58:44 crc kubenswrapper[4713]: I0127 15:58:44.407532 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="service-telemetry/elasticsearch-es-default-0" Jan 27 15:58:44 crc kubenswrapper[4713]: I0127 15:58:44.880237 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-86cb77c54b-jcngs"] Jan 27 15:58:44 crc kubenswrapper[4713]: I0127 15:58:44.884684 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:44 crc kubenswrapper[4713]: I0127 15:58:44.890141 4713 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-trlfc" Jan 27 15:58:44 crc kubenswrapper[4713]: I0127 15:58:44.930381 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-jcngs"] Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.005489 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/496fd2b2-f168-4003-8dff-5bec14a42142-bound-sa-token\") pod \"cert-manager-86cb77c54b-jcngs\" (UID: \"496fd2b2-f168-4003-8dff-5bec14a42142\") " pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.005553 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cqb5\" (UniqueName: \"kubernetes.io/projected/496fd2b2-f168-4003-8dff-5bec14a42142-kube-api-access-9cqb5\") pod \"cert-manager-86cb77c54b-jcngs\" (UID: \"496fd2b2-f168-4003-8dff-5bec14a42142\") " pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.107336 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/496fd2b2-f168-4003-8dff-5bec14a42142-bound-sa-token\") pod \"cert-manager-86cb77c54b-jcngs\" (UID: \"496fd2b2-f168-4003-8dff-5bec14a42142\") " pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.107407 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cqb5\" (UniqueName: \"kubernetes.io/projected/496fd2b2-f168-4003-8dff-5bec14a42142-kube-api-access-9cqb5\") pod \"cert-manager-86cb77c54b-jcngs\" (UID: 
\"496fd2b2-f168-4003-8dff-5bec14a42142\") " pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.142219 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/496fd2b2-f168-4003-8dff-5bec14a42142-bound-sa-token\") pod \"cert-manager-86cb77c54b-jcngs\" (UID: \"496fd2b2-f168-4003-8dff-5bec14a42142\") " pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.156314 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cqb5\" (UniqueName: \"kubernetes.io/projected/496fd2b2-f168-4003-8dff-5bec14a42142-kube-api-access-9cqb5\") pod \"cert-manager-86cb77c54b-jcngs\" (UID: \"496fd2b2-f168-4003-8dff-5bec14a42142\") " pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.190657 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.192523 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.195358 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-global-ca" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.195401 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-sys-config" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.205352 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-2-ca" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.211209 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.238584 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-86cb77c54b-jcngs" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.308832 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.308889 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.308914 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.308939 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.308959 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309120 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309283 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-run\") pod 
\"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309366 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309422 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309507 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309656 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.309689 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hvhzj\" (UniqueName: \"kubernetes.io/projected/9f3cc06d-cc80-421c-b74d-3951888c169d-kube-api-access-hvhzj\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410670 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410733 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hvhzj\" (UniqueName: \"kubernetes.io/projected/9f3cc06d-cc80-421c-b74d-3951888c169d-kube-api-access-hvhzj\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410760 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410783 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410800 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410818 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410836 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410861 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410880 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: 
\"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410889 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-node-pullsecrets\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410922 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-buildcachedir\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.410902 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411176 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411248 4713 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411490 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-run\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411526 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-system-configs\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411645 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-root\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411778 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-proxy-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 
crc kubenswrapper[4713]: I0127 15:58:45.411784 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-buildworkdir\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.411923 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-build-blob-cache\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.412244 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-ca-bundles\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.424495 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.424811 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " 
pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.426829 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hvhzj\" (UniqueName: \"kubernetes.io/projected/9f3cc06d-cc80-421c-b74d-3951888c169d-kube-api-access-hvhzj\") pod \"service-telemetry-operator-2-build\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:45 crc kubenswrapper[4713]: I0127 15:58:45.512561 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:49 crc kubenswrapper[4713]: I0127 15:58:49.179990 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 15:58:49 crc kubenswrapper[4713]: I0127 15:58:49.241239 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-86cb77c54b-jcngs"] Jan 27 15:58:49 crc kubenswrapper[4713]: I0127 15:58:49.955685 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9f3cc06d-cc80-421c-b74d-3951888c169d","Type":"ContainerStarted","Data":"5de6909c6248a86d654e7ff5b9deecb378cc589be01b6d26adf66ca3f946c973"} Jan 27 15:58:49 crc kubenswrapper[4713]: I0127 15:58:49.957197 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-jcngs" event={"ID":"496fd2b2-f168-4003-8dff-5bec14a42142","Type":"ContainerStarted","Data":"0b5da267bfa038310ba340989e9220696bd5e761f2f2abd08733afe23e842899"} Jan 27 15:58:50 crc kubenswrapper[4713]: I0127 15:58:50.965490 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" event={"ID":"81a892d6-5f04-4cbd-b78b-2f1ca484b3a3","Type":"ContainerStarted","Data":"d30656a055d23f3dab7abea5fd5d75887f4279e2f6cbe5b5bcd8cf83b64a1490"} Jan 27 
15:58:51 crc kubenswrapper[4713]: I0127 15:58:51.975443 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"187d6b27-9093-4df4-b5cd-a89ed60f7d29","Type":"ContainerStarted","Data":"eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e"} Jan 27 15:58:51 crc kubenswrapper[4713]: I0127 15:58:51.975611 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-1-build" podUID="187d6b27-9093-4df4-b5cd-a89ed60f7d29" containerName="manage-dockerfile" containerID="cri-o://eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e" gracePeriod=30 Jan 27 15:58:51 crc kubenswrapper[4713]: I0127 15:58:51.978732 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9f3cc06d-cc80-421c-b74d-3951888c169d","Type":"ContainerStarted","Data":"1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622"} Jan 27 15:58:51 crc kubenswrapper[4713]: I0127 15:58:51.980751 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" event={"ID":"d3b82c14-e19f-4811-ad1b-c381f5629a67","Type":"ContainerStarted","Data":"b1714019d5971427ba0059928617b6127363454091e289cf59e1cb7ccd436009"} Jan 27 15:58:51 crc kubenswrapper[4713]: I0127 15:58:51.982787 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-86cb77c54b-jcngs" event={"ID":"496fd2b2-f168-4003-8dff-5bec14a42142","Type":"ContainerStarted","Data":"77f1ad41646d8ba142e84cf488ed1b61618a92c4750756c1b2b5ab3abca0c66b"} Jan 27 15:58:51 crc kubenswrapper[4713]: I0127 15:58:51.982936 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.053944 4713 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="cert-manager/cert-manager-cainjector-855d9ccff4-nx2kd" podStartSLOduration=4.69143931 podStartE2EDuration="15.053900444s" podCreationTimestamp="2026-01-27 15:58:37 +0000 UTC" firstStartedPulling="2026-01-27 15:58:38.586538787 +0000 UTC m=+686.364748725" lastFinishedPulling="2026-01-27 15:58:48.948999921 +0000 UTC m=+696.727209859" observedRunningTime="2026-01-27 15:58:52.037999645 +0000 UTC m=+699.816209583" watchObservedRunningTime="2026-01-27 15:58:52.053900444 +0000 UTC m=+699.832110382" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.069402 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-86cb77c54b-jcngs" podStartSLOduration=8.069370429 podStartE2EDuration="8.069370429s" podCreationTimestamp="2026-01-27 15:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-27 15:58:52.064251981 +0000 UTC m=+699.842461919" watchObservedRunningTime="2026-01-27 15:58:52.069370429 +0000 UTC m=+699.847580377" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.126636 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" podStartSLOduration=4.1352004430000004 podStartE2EDuration="19.12661552s" podCreationTimestamp="2026-01-27 15:58:33 +0000 UTC" firstStartedPulling="2026-01-27 15:58:33.992243494 +0000 UTC m=+681.770453432" lastFinishedPulling="2026-01-27 15:58:48.983658571 +0000 UTC m=+696.761868509" observedRunningTime="2026-01-27 15:58:52.123750187 +0000 UTC m=+699.901960125" watchObservedRunningTime="2026-01-27 15:58:52.12661552 +0000 UTC m=+699.904825458" Jan 27 15:58:52 crc kubenswrapper[4713]: E0127 15:58:52.142294 4713 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=9140154571024215360, SKID=, AKID=10:1E:84:9E:B5:F4:AB:1F:47:8D:17:2E:A5:84:11:11:5E:8E:8C:8B failed: x509: 
certificate signed by unknown authority" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.432799 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_187d6b27-9093-4df4-b5cd-a89ed60f7d29/manage-dockerfile/0.log" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.433215 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528479 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-push\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528635 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-node-pullsecrets\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528646 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528674 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-skkxf\" (UniqueName: \"kubernetes.io/projected/187d6b27-9093-4df4-b5cd-a89ed60f7d29-kube-api-access-skkxf\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528717 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-blob-cache\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528770 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-root\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528798 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-system-configs\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528818 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-run\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528837 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-pull\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528859 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-proxy-ca-bundles\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528897 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-ca-bundles\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528934 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildworkdir\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.528978 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildcachedir\") pod \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\" (UID: \"187d6b27-9093-4df4-b5cd-a89ed60f7d29\") " Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529290 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 
15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529324 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529420 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529621 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529574 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "container-storage-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529684 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529777 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529792 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.529873 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.534693 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-pull" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-pull") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "builder-dockercfg-5v6b8-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.535149 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-push" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-push") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "builder-dockercfg-5v6b8-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.536987 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187d6b27-9093-4df4-b5cd-a89ed60f7d29-kube-api-access-skkxf" (OuterVolumeSpecName: "kube-api-access-skkxf") pod "187d6b27-9093-4df4-b5cd-a89ed60f7d29" (UID: "187d6b27-9093-4df4-b5cd-a89ed60f7d29"). InnerVolumeSpecName "kube-api-access-skkxf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630382 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630410 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/187d6b27-9093-4df4-b5cd-a89ed60f7d29-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630420 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-push\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630432 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-skkxf\" (UniqueName: \"kubernetes.io/projected/187d6b27-9093-4df4-b5cd-a89ed60f7d29-kube-api-access-skkxf\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630441 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630449 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630457 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-system-configs\") on node \"crc\" DevicePath \"\"" 
Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630465 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/187d6b27-9093-4df4-b5cd-a89ed60f7d29-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630472 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/187d6b27-9093-4df4-b5cd-a89ed60f7d29-builder-dockercfg-5v6b8-pull\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630482 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.630490 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/187d6b27-9093-4df4-b5cd-a89ed60f7d29-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.991243 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-1-build_187d6b27-9093-4df4-b5cd-a89ed60f7d29/manage-dockerfile/0.log" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.991334 4713 generic.go:334] "Generic (PLEG): container finished" podID="187d6b27-9093-4df4-b5cd-a89ed60f7d29" containerID="eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e" exitCode=1 Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.991426 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"187d6b27-9093-4df4-b5cd-a89ed60f7d29","Type":"ContainerDied","Data":"eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e"} Jan 27 15:58:52 crc kubenswrapper[4713]: 
I0127 15:58:52.991493 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-1-build" event={"ID":"187d6b27-9093-4df4-b5cd-a89ed60f7d29","Type":"ContainerDied","Data":"3fc5b3bbc9e0e4228a2ae6c6ec9a8dd1d7c3233000662bc9c477f40a2898fdca"} Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.991519 4713 scope.go:117] "RemoveContainer" containerID="eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e" Jan 27 15:58:52 crc kubenswrapper[4713]: I0127 15:58:52.991726 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-1-build" Jan 27 15:58:53 crc kubenswrapper[4713]: I0127 15:58:53.016004 4713 scope.go:117] "RemoveContainer" containerID="eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e" Jan 27 15:58:53 crc kubenswrapper[4713]: E0127 15:58:53.016730 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e\": container with ID starting with eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e not found: ID does not exist" containerID="eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e" Jan 27 15:58:53 crc kubenswrapper[4713]: I0127 15:58:53.016768 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e"} err="failed to get container status \"eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e\": rpc error: code = NotFound desc = could not find container \"eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e\": container with ID starting with eda3552a5eb70961fc8634b01bb7e9c5d7a77167b6575fd5c3c964db51fdc21e not found: ID does not exist" Jan 27 15:58:53 crc kubenswrapper[4713]: I0127 15:58:53.017514 4713 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 15:58:53 crc kubenswrapper[4713]: I0127 15:58:53.025255 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-1-build"] Jan 27 15:58:53 crc kubenswrapper[4713]: I0127 15:58:53.172941 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.002498 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-2-build" podUID="9f3cc06d-cc80-421c-b74d-3951888c169d" containerName="git-clone" containerID="cri-o://1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622" gracePeriod=30 Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.427286 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_9f3cc06d-cc80-421c-b74d-3951888c169d/git-clone/0.log" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.427365 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560383 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-pull\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560450 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-root\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560488 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-buildworkdir\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560510 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-system-configs\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560529 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-node-pullsecrets\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560561 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-build-blob-cache\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560579 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-run\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560621 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-push\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560671 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-buildcachedir\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560699 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-proxy-ca-bundles\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560732 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hvhzj\" (UniqueName: 
\"kubernetes.io/projected/9f3cc06d-cc80-421c-b74d-3951888c169d-kube-api-access-hvhzj\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560768 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-ca-bundles\") pod \"9f3cc06d-cc80-421c-b74d-3951888c169d\" (UID: \"9f3cc06d-cc80-421c-b74d-3951888c169d\") " Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.560983 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.561325 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.561523 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "build-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.561656 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.561659 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.561748 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.562076 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "build-proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.562100 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.562198 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.566234 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-push" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-push") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "builder-dockercfg-5v6b8-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.570136 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-pull" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-pull") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "builder-dockercfg-5v6b8-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.574920 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f3cc06d-cc80-421c-b74d-3951888c169d-kube-api-access-hvhzj" (OuterVolumeSpecName: "kube-api-access-hvhzj") pod "9f3cc06d-cc80-421c-b74d-3951888c169d" (UID: "9f3cc06d-cc80-421c-b74d-3951888c169d"). InnerVolumeSpecName "kube-api-access-hvhzj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662268 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662826 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hvhzj\" (UniqueName: \"kubernetes.io/projected/9f3cc06d-cc80-421c-b74d-3951888c169d-kube-api-access-hvhzj\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662838 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662847 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-pull\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662858 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662867 4713 
reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662876 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/9f3cc06d-cc80-421c-b74d-3951888c169d-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662890 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662910 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662924 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/9f3cc06d-cc80-421c-b74d-3951888c169d-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662933 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/9f3cc06d-cc80-421c-b74d-3951888c169d-builder-dockercfg-5v6b8-push\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.662946 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/9f3cc06d-cc80-421c-b74d-3951888c169d-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 15:58:54 crc kubenswrapper[4713]: I0127 15:58:54.917058 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="187d6b27-9093-4df4-b5cd-a89ed60f7d29" path="/var/lib/kubelet/pods/187d6b27-9093-4df4-b5cd-a89ed60f7d29/volumes" Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.012599 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-2-build_9f3cc06d-cc80-421c-b74d-3951888c169d/git-clone/0.log" Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.012671 4713 generic.go:334] "Generic (PLEG): container finished" podID="9f3cc06d-cc80-421c-b74d-3951888c169d" containerID="1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622" exitCode=1 Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.012742 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9f3cc06d-cc80-421c-b74d-3951888c169d","Type":"ContainerDied","Data":"1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622"} Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.012779 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-2-build" event={"ID":"9f3cc06d-cc80-421c-b74d-3951888c169d","Type":"ContainerDied","Data":"5de6909c6248a86d654e7ff5b9deecb378cc589be01b6d26adf66ca3f946c973"} Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.012795 4713 scope.go:117] "RemoveContainer" containerID="1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622" Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.013015 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-2-build" Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.040328 4713 scope.go:117] "RemoveContainer" containerID="1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622" Jan 27 15:58:55 crc kubenswrapper[4713]: E0127 15:58:55.043020 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622\": container with ID starting with 1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622 not found: ID does not exist" containerID="1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622" Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.043129 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622"} err="failed to get container status \"1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622\": rpc error: code = NotFound desc = could not find container \"1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622\": container with ID starting with 1a75af27c3b91a118be1e2626607e06ce6051e6f4f188448d86702963dbde622 not found: ID does not exist" Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.050712 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 15:58:55 crc kubenswrapper[4713]: I0127 15:58:55.055684 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-2-build"] Jan 27 15:58:56 crc kubenswrapper[4713]: I0127 15:58:56.909788 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f3cc06d-cc80-421c-b74d-3951888c169d" path="/var/lib/kubelet/pods/9f3cc06d-cc80-421c-b74d-3951888c169d/volumes" Jan 27 15:58:58 crc kubenswrapper[4713]: I0127 15:58:58.679791 4713 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-f4fb5df64-llgg9" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.606648 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 27 15:59:04 crc kubenswrapper[4713]: E0127 15:59:04.607608 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f3cc06d-cc80-421c-b74d-3951888c169d" containerName="git-clone" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.607625 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f3cc06d-cc80-421c-b74d-3951888c169d" containerName="git-clone" Jan 27 15:59:04 crc kubenswrapper[4713]: E0127 15:59:04.607660 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187d6b27-9093-4df4-b5cd-a89ed60f7d29" containerName="manage-dockerfile" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.607668 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="187d6b27-9093-4df4-b5cd-a89ed60f7d29" containerName="manage-dockerfile" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.607795 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="187d6b27-9093-4df4-b5cd-a89ed60f7d29" containerName="manage-dockerfile" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.607815 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f3cc06d-cc80-421c-b74d-3951888c169d" containerName="git-clone" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.608879 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.611481 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-global-ca" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.611492 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-5v6b8" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.611521 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-ca" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.611534 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-3-sys-config" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.626620 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704265 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704353 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704403 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704426 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704453 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704556 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704582 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: 
\"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704604 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704619 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704723 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704804 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.704831 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxv6v\" (UniqueName: \"kubernetes.io/projected/afa2170f-ce93-4ef5-a13f-db23e1867eb9-kube-api-access-xxv6v\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.805860 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.805920 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.805958 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.805976 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " 
pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.805999 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806025 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806060 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806085 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806103 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-push\") pod 
\"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806127 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806150 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806171 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxv6v\" (UniqueName: \"kubernetes.io/projected/afa2170f-ce93-4ef5-a13f-db23e1867eb9-kube-api-access-xxv6v\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806192 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildcachedir\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806282 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-node-pullsecrets\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.806922 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-root\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.807571 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-proxy-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.807623 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-run\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.807660 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-system-configs\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.807788 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-ca-bundles\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.808141 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-blob-cache\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.808248 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildworkdir\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.817911 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.820201 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc 
kubenswrapper[4713]: I0127 15:59:04.836027 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxv6v\" (UniqueName: \"kubernetes.io/projected/afa2170f-ce93-4ef5-a13f-db23e1867eb9-kube-api-access-xxv6v\") pod \"service-telemetry-operator-3-build\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:04 crc kubenswrapper[4713]: I0127 15:59:04.926689 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:05 crc kubenswrapper[4713]: I0127 15:59:05.336392 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 27 15:59:06 crc kubenswrapper[4713]: I0127 15:59:06.089344 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"afa2170f-ce93-4ef5-a13f-db23e1867eb9","Type":"ContainerStarted","Data":"00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd"} Jan 27 15:59:06 crc kubenswrapper[4713]: I0127 15:59:06.090423 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"afa2170f-ce93-4ef5-a13f-db23e1867eb9","Type":"ContainerStarted","Data":"0ac257f689debc38f1aed5365ed6ce638b0e33559664c68b12b72c7b5f4fd216"} Jan 27 15:59:06 crc kubenswrapper[4713]: E0127 15:59:06.147953 4713 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=9140154571024215360, SKID=, AKID=10:1E:84:9E:B5:F4:AB:1F:47:8D:17:2E:A5:84:11:11:5E:8E:8C:8B failed: x509: certificate signed by unknown authority" Jan 27 15:59:07 crc kubenswrapper[4713]: I0127 15:59:07.173950 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.103822 4713 
kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-3-build" podUID="afa2170f-ce93-4ef5-a13f-db23e1867eb9" containerName="git-clone" containerID="cri-o://00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd" gracePeriod=30 Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.505850 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_afa2170f-ce93-4ef5-a13f-db23e1867eb9/git-clone/0.log" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.506338 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675013 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-node-pullsecrets\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675123 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-proxy-ca-bundles\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675171 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-pull\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675197 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildcachedir\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675244 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-root\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675320 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-run\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675360 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxv6v\" (UniqueName: \"kubernetes.io/projected/afa2170f-ce93-4ef5-a13f-db23e1867eb9-kube-api-access-xxv6v\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675362 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "buildcachedir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675385 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-push\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675535 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-system-configs\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675599 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-blob-cache\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675930 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675961 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). 
InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.675279 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.676014 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.676006 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.676264 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.676431 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-ca-bundles\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677116 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677188 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildworkdir\") pod \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\" (UID: \"afa2170f-ce93-4ef5-a13f-db23e1867eb9\") " Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677808 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677847 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677864 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677881 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677895 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677908 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677922 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677935 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afa2170f-ce93-4ef5-a13f-db23e1867eb9-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.677861 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "buildworkdir". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.680899 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afa2170f-ce93-4ef5-a13f-db23e1867eb9-kube-api-access-xxv6v" (OuterVolumeSpecName: "kube-api-access-xxv6v") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "kube-api-access-xxv6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.681377 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-push" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-push") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "builder-dockercfg-5v6b8-push". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.681650 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-pull" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-pull") pod "afa2170f-ce93-4ef5-a13f-db23e1867eb9" (UID: "afa2170f-ce93-4ef5-a13f-db23e1867eb9"). InnerVolumeSpecName "builder-dockercfg-5v6b8-pull". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.779072 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-pull\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.779119 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxv6v\" (UniqueName: \"kubernetes.io/projected/afa2170f-ce93-4ef5-a13f-db23e1867eb9-kube-api-access-xxv6v\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.779128 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/afa2170f-ce93-4ef5-a13f-db23e1867eb9-builder-dockercfg-5v6b8-push\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:08 crc kubenswrapper[4713]: I0127 15:59:08.779139 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/afa2170f-ce93-4ef5-a13f-db23e1867eb9-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.111909 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-3-build_afa2170f-ce93-4ef5-a13f-db23e1867eb9/git-clone/0.log" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.112474 4713 generic.go:334] "Generic (PLEG): container finished" podID="afa2170f-ce93-4ef5-a13f-db23e1867eb9" containerID="00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd" exitCode=1 Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.112531 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" 
event={"ID":"afa2170f-ce93-4ef5-a13f-db23e1867eb9","Type":"ContainerDied","Data":"00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd"} Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.112568 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-3-build" event={"ID":"afa2170f-ce93-4ef5-a13f-db23e1867eb9","Type":"ContainerDied","Data":"0ac257f689debc38f1aed5365ed6ce638b0e33559664c68b12b72c7b5f4fd216"} Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.112592 4713 scope.go:117] "RemoveContainer" containerID="00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.112671 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-3-build" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.137866 4713 scope.go:117] "RemoveContainer" containerID="00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.138856 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 27 15:59:09 crc kubenswrapper[4713]: E0127 15:59:09.140290 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd\": container with ID starting with 00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd not found: ID does not exist" containerID="00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.140340 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd"} err="failed to get container status 
\"00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd\": rpc error: code = NotFound desc = could not find container \"00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd\": container with ID starting with 00259abb746f3fab7500c6c149cf1106cf36c288736fe2c35f8fd8d3ff0c92fd not found: ID does not exist" Jan 27 15:59:09 crc kubenswrapper[4713]: I0127 15:59:09.145158 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-3-build"] Jan 27 15:59:10 crc kubenswrapper[4713]: I0127 15:59:10.909384 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afa2170f-ce93-4ef5-a13f-db23e1867eb9" path="/var/lib/kubelet/pods/afa2170f-ce93-4ef5-a13f-db23e1867eb9/volumes" Jan 27 15:59:12 crc kubenswrapper[4713]: I0127 15:59:12.555560 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:59:12 crc kubenswrapper[4713]: I0127 15:59:12.556006 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.601611 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 27 15:59:18 crc kubenswrapper[4713]: E0127 15:59:18.602656 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afa2170f-ce93-4ef5-a13f-db23e1867eb9" containerName="git-clone" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.602680 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="afa2170f-ce93-4ef5-a13f-db23e1867eb9" containerName="git-clone" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.602826 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="afa2170f-ce93-4ef5-a13f-db23e1867eb9" containerName="git-clone" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.603765 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.606152 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-global-ca" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.606177 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-5v6b8" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.607519 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-ca" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.608195 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-4-sys-config" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.623677 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714112 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714156 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714184 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714203 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714221 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714244 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714433 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714521 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714574 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714680 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz7sv\" (UniqueName: \"kubernetes.io/projected/29eaf9b1-c633-4d90-ad08-e9a5608de93c-kube-api-access-mz7sv\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714761 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.714818 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816014 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816100 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816123 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc 
kubenswrapper[4713]: I0127 15:59:18.816276 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-blob-cache\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816300 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816326 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816426 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildcachedir\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816623 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-root\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " 
pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817159 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.816349 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-proxy-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817398 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildworkdir\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817525 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz7sv\" (UniqueName: \"kubernetes.io/projected/29eaf9b1-c633-4d90-ad08-e9a5608de93c-kube-api-access-mz7sv\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817554 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-blob-cache\") pod 
\"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817636 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817706 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817752 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.817785 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.818130 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-node-pullsecrets\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.818160 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-system-configs\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.818175 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-run\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.818911 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-ca-bundles\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.822488 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.824648 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.835892 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz7sv\" (UniqueName: \"kubernetes.io/projected/29eaf9b1-c633-4d90-ad08-e9a5608de93c-kube-api-access-mz7sv\") pod \"service-telemetry-operator-4-build\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:18 crc kubenswrapper[4713]: I0127 15:59:18.920451 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:19 crc kubenswrapper[4713]: I0127 15:59:19.194586 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 27 15:59:20 crc kubenswrapper[4713]: I0127 15:59:20.209870 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"29eaf9b1-c633-4d90-ad08-e9a5608de93c","Type":"ContainerStarted","Data":"935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc"} Jan 27 15:59:20 crc kubenswrapper[4713]: I0127 15:59:20.210292 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"29eaf9b1-c633-4d90-ad08-e9a5608de93c","Type":"ContainerStarted","Data":"412f4a0e5615b0fb862ee221c09760263af97319fd1c95d9365d55d3d986b7ff"} Jan 27 15:59:20 crc kubenswrapper[4713]: E0127 15:59:20.263497 4713 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=9140154571024215360, SKID=, 
AKID=10:1E:84:9E:B5:F4:AB:1F:47:8D:17:2E:A5:84:11:11:5E:8E:8C:8B failed: x509: certificate signed by unknown authority" Jan 27 15:59:21 crc kubenswrapper[4713]: I0127 15:59:21.310951 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.220396 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-4-build" podUID="29eaf9b1-c633-4d90-ad08-e9a5608de93c" containerName="git-clone" containerID="cri-o://935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc" gracePeriod=30 Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.575079 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_29eaf9b1-c633-4d90-ad08-e9a5608de93c/git-clone/0.log" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.575684 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668675 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-push\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668746 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-node-pullsecrets\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668776 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-run\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668807 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildworkdir\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668850 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz7sv\" (UniqueName: \"kubernetes.io/projected/29eaf9b1-c633-4d90-ad08-e9a5608de93c-kube-api-access-mz7sv\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668869 4713 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildcachedir\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668887 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-system-configs\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668913 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-blob-cache\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668936 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-pull\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668956 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-ca-bundles\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.668982 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-proxy-ca-bundles\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.669033 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-root\") pod \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\" (UID: \"29eaf9b1-c633-4d90-ad08-e9a5608de93c\") " Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.669257 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.669531 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.669553 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "node-pullsecrets". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.669807 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "container-storage-root". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.670012 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.670058 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.670410 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "build-blob-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.670418 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.670480 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.670724 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.675549 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-push" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-push") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "builder-dockercfg-5v6b8-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.675764 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-pull" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-pull") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "builder-dockercfg-5v6b8-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.675864 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29eaf9b1-c633-4d90-ad08-e9a5608de93c-kube-api-access-mz7sv" (OuterVolumeSpecName: "kube-api-access-mz7sv") pod "29eaf9b1-c633-4d90-ad08-e9a5608de93c" (UID: "29eaf9b1-c633-4d90-ad08-e9a5608de93c"). InnerVolumeSpecName "kube-api-access-mz7sv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.770895 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-push\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.770979 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/29eaf9b1-c633-4d90-ad08-e9a5608de93c-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.770993 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771008 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: 
\"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771019 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz7sv\" (UniqueName: \"kubernetes.io/projected/29eaf9b1-c633-4d90-ad08-e9a5608de93c-kube-api-access-mz7sv\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771031 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771126 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771139 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/29eaf9b1-c633-4d90-ad08-e9a5608de93c-builder-dockercfg-5v6b8-pull\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771150 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771161 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/29eaf9b1-c633-4d90-ad08-e9a5608de93c-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:22 crc kubenswrapper[4713]: I0127 15:59:22.771173 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: 
\"kubernetes.io/empty-dir/29eaf9b1-c633-4d90-ad08-e9a5608de93c-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.227604 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-4-build_29eaf9b1-c633-4d90-ad08-e9a5608de93c/git-clone/0.log" Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.227667 4713 generic.go:334] "Generic (PLEG): container finished" podID="29eaf9b1-c633-4d90-ad08-e9a5608de93c" containerID="935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc" exitCode=1 Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.227713 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"29eaf9b1-c633-4d90-ad08-e9a5608de93c","Type":"ContainerDied","Data":"935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc"} Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.227729 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-4-build" Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.227754 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-4-build" event={"ID":"29eaf9b1-c633-4d90-ad08-e9a5608de93c","Type":"ContainerDied","Data":"412f4a0e5615b0fb862ee221c09760263af97319fd1c95d9365d55d3d986b7ff"} Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.227783 4713 scope.go:117] "RemoveContainer" containerID="935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc" Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.249519 4713 scope.go:117] "RemoveContainer" containerID="935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc" Jan 27 15:59:23 crc kubenswrapper[4713]: E0127 15:59:23.250303 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc\": container with ID starting with 935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc not found: ID does not exist" containerID="935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc" Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.250337 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc"} err="failed to get container status \"935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc\": rpc error: code = NotFound desc = could not find container \"935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc\": container with ID starting with 935a6bda4149cad0946556ef5ab595ca29e3430b29d1c15164591b38501b99fc not found: ID does not exist" Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.259919 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 27 15:59:23 crc kubenswrapper[4713]: I0127 15:59:23.266337 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-4-build"] Jan 27 15:59:24 crc kubenswrapper[4713]: I0127 15:59:24.906942 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29eaf9b1-c633-4d90-ad08-e9a5608de93c" path="/var/lib/kubelet/pods/29eaf9b1-c633-4d90-ad08-e9a5608de93c/volumes" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.741008 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 27 15:59:32 crc kubenswrapper[4713]: E0127 15:59:32.741953 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29eaf9b1-c633-4d90-ad08-e9a5608de93c" containerName="git-clone" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.741966 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="29eaf9b1-c633-4d90-ad08-e9a5608de93c" containerName="git-clone" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.742099 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="29eaf9b1-c633-4d90-ad08-e9a5608de93c" containerName="git-clone" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.742891 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.745605 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-ca" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.745933 4713 reflector.go:368] Caches populated for *v1.Secret from object-"service-telemetry"/"builder-dockercfg-5v6b8" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.745995 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-global-ca" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.747020 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"service-telemetry"/"service-telemetry-operator-5-sys-config" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.761278 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.859864 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.859909 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.859931 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.859949 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.859972 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860001 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860129 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-5-build\" (UID: 
\"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860171 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860190 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860272 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860295 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.860315 4713 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dvxgt\" (UniqueName: \"kubernetes.io/projected/d1387339-edf1-4b50-842d-7a7353c1b620-kube-api-access-dvxgt\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961450 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961533 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961569 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961589 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " 
pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961626 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961643 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961663 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dvxgt\" (UniqueName: \"kubernetes.io/projected/d1387339-edf1-4b50-842d-7a7353c1b620-kube-api-access-dvxgt\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961698 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961715 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961734 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961750 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961772 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.961992 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-buildworkdir\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.962283 4713 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-system-configs\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.962334 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-buildcachedir\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.962406 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-root\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.962354 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-node-pullsecrets\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.962425 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-run\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 
15:59:32.962518 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-build-blob-cache\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.963066 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-proxy-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.963463 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-ca-bundles\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.975682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-pull\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.975682 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-push\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " 
pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:32 crc kubenswrapper[4713]: I0127 15:59:32.979476 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dvxgt\" (UniqueName: \"kubernetes.io/projected/d1387339-edf1-4b50-842d-7a7353c1b620-kube-api-access-dvxgt\") pod \"service-telemetry-operator-5-build\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:33 crc kubenswrapper[4713]: I0127 15:59:33.117519 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:33 crc kubenswrapper[4713]: I0127 15:59:33.336653 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 27 15:59:34 crc kubenswrapper[4713]: I0127 15:59:34.306883 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"d1387339-edf1-4b50-842d-7a7353c1b620","Type":"ContainerStarted","Data":"0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2"} Jan 27 15:59:34 crc kubenswrapper[4713]: I0127 15:59:34.307437 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"d1387339-edf1-4b50-842d-7a7353c1b620","Type":"ContainerStarted","Data":"36d64e6d7a93f366e02e737c2f9f6fd6acb8507e28aaf6ad36d47cc0a676da49"} Jan 27 15:59:34 crc kubenswrapper[4713]: E0127 15:59:34.390530 4713 server.go:309] "Unable to authenticate the request due to an error" err="verifying certificate SN=9140154571024215360, SKID=, AKID=10:1E:84:9E:B5:F4:AB:1F:47:8D:17:2E:A5:84:11:11:5E:8E:8C:8B failed: x509: certificate signed by unknown authority" Jan 27 15:59:35 crc kubenswrapper[4713]: I0127 15:59:35.420558 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] 
Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.321143 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="service-telemetry/service-telemetry-operator-5-build" podUID="d1387339-edf1-4b50-842d-7a7353c1b620" containerName="git-clone" containerID="cri-o://0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2" gracePeriod=30 Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.742567 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_d1387339-edf1-4b50-842d-7a7353c1b620/git-clone/0.log" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.742681 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918393 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-run\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918467 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-build-blob-cache\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918491 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-system-configs\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918526 4713 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-push\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918544 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-buildworkdir\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918568 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-ca-bundles\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918589 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"buildcachedir\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-buildcachedir\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918608 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-proxy-ca-bundles\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918650 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dvxgt\" (UniqueName: \"kubernetes.io/projected/d1387339-edf1-4b50-842d-7a7353c1b620-kube-api-access-dvxgt\") pod 
\"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918677 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-node-pullsecrets\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918725 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-root\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.918745 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-pull\") pod \"d1387339-edf1-4b50-842d-7a7353c1b620\" (UID: \"d1387339-edf1-4b50-842d-7a7353c1b620\") " Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.919558 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-buildcachedir" (OuterVolumeSpecName: "buildcachedir") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "buildcachedir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.919693 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). 
InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.921701 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-run" (OuterVolumeSpecName: "container-storage-run") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "container-storage-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.921839 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-system-configs" (OuterVolumeSpecName: "build-system-configs") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "build-system-configs". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.924275 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-build-blob-cache" (OuterVolumeSpecName: "build-blob-cache") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "build-blob-cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.924806 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-root" (OuterVolumeSpecName: "container-storage-root") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "container-storage-root". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.924851 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-ca-bundles" (OuterVolumeSpecName: "build-ca-bundles") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "build-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.924899 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-buildworkdir" (OuterVolumeSpecName: "buildworkdir") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "buildworkdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.925016 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-proxy-ca-bundles" (OuterVolumeSpecName: "build-proxy-ca-bundles") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "build-proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.926546 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-push" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-push") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "builder-dockercfg-5v6b8-push". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.927350 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1387339-edf1-4b50-842d-7a7353c1b620-kube-api-access-dvxgt" (OuterVolumeSpecName: "kube-api-access-dvxgt") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "kube-api-access-dvxgt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 15:59:36 crc kubenswrapper[4713]: I0127 15:59:36.930061 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-pull" (OuterVolumeSpecName: "builder-dockercfg-5v6b8-pull") pod "d1387339-edf1-4b50-842d-7a7353c1b620" (UID: "d1387339-edf1-4b50-842d-7a7353c1b620"). InnerVolumeSpecName "builder-dockercfg-5v6b8-pull". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020228 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dvxgt\" (UniqueName: \"kubernetes.io/projected/d1387339-edf1-4b50-842d-7a7353c1b620-kube-api-access-dvxgt\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020267 4713 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-node-pullsecrets\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020277 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-root\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-root\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020286 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-pull\" (UniqueName: 
\"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-pull\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020295 4713 reconciler_common.go:293] "Volume detached for volume \"container-storage-run\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-container-storage-run\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020305 4713 reconciler_common.go:293] "Volume detached for volume \"build-blob-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-build-blob-cache\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020314 4713 reconciler_common.go:293] "Volume detached for volume \"build-system-configs\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-system-configs\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020323 4713 reconciler_common.go:293] "Volume detached for volume \"builder-dockercfg-5v6b8-push\" (UniqueName: \"kubernetes.io/secret/d1387339-edf1-4b50-842d-7a7353c1b620-builder-dockercfg-5v6b8-push\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020332 4713 reconciler_common.go:293] "Volume detached for volume \"buildworkdir\" (UniqueName: \"kubernetes.io/empty-dir/d1387339-edf1-4b50-842d-7a7353c1b620-buildworkdir\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020342 4713 reconciler_common.go:293] "Volume detached for volume \"build-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020353 4713 reconciler_common.go:293] "Volume detached for volume \"buildcachedir\" (UniqueName: 
\"kubernetes.io/host-path/d1387339-edf1-4b50-842d-7a7353c1b620-buildcachedir\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.020362 4713 reconciler_common.go:293] "Volume detached for volume \"build-proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/d1387339-edf1-4b50-842d-7a7353c1b620-build-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.360394 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/service-telemetry_service-telemetry-operator-5-build_d1387339-edf1-4b50-842d-7a7353c1b620/git-clone/0.log" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.360765 4713 generic.go:334] "Generic (PLEG): container finished" podID="d1387339-edf1-4b50-842d-7a7353c1b620" containerID="0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2" exitCode=1 Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.360798 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"d1387339-edf1-4b50-842d-7a7353c1b620","Type":"ContainerDied","Data":"0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2"} Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.360827 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="service-telemetry/service-telemetry-operator-5-build" event={"ID":"d1387339-edf1-4b50-842d-7a7353c1b620","Type":"ContainerDied","Data":"36d64e6d7a93f366e02e737c2f9f6fd6acb8507e28aaf6ad36d47cc0a676da49"} Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.360845 4713 scope.go:117] "RemoveContainer" containerID="0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.360994 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="service-telemetry/service-telemetry-operator-5-build" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.397815 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.400679 4713 scope.go:117] "RemoveContainer" containerID="0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2" Jan 27 15:59:37 crc kubenswrapper[4713]: E0127 15:59:37.401093 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2\": container with ID starting with 0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2 not found: ID does not exist" containerID="0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.401126 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2"} err="failed to get container status \"0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2\": rpc error: code = NotFound desc = could not find container \"0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2\": container with ID starting with 0c30a137b50a74952b07f1c1045ffc18c4c4329b576e282cacb56ee2358a89a2 not found: ID does not exist" Jan 27 15:59:37 crc kubenswrapper[4713]: I0127 15:59:37.403634 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["service-telemetry/service-telemetry-operator-5-build"] Jan 27 15:59:38 crc kubenswrapper[4713]: I0127 15:59:38.909092 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1387339-edf1-4b50-842d-7a7353c1b620" path="/var/lib/kubelet/pods/d1387339-edf1-4b50-842d-7a7353c1b620/volumes" Jan 27 15:59:42 crc kubenswrapper[4713]: I0127 15:59:42.555413 4713 
patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 15:59:42 crc kubenswrapper[4713]: I0127 15:59:42.555791 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 15:59:49 crc kubenswrapper[4713]: I0127 15:59:49.427734 4713 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.176425 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2"] Jan 27 16:00:00 crc kubenswrapper[4713]: E0127 16:00:00.179156 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d1387339-edf1-4b50-842d-7a7353c1b620" containerName="git-clone" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.179203 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1387339-edf1-4b50-842d-7a7353c1b620" containerName="git-clone" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.179361 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="d1387339-edf1-4b50-842d-7a7353c1b620" containerName="git-clone" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.180057 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.184174 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.184173 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.187922 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2"] Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.285609 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-secret-volume\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.285731 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2gvl\" (UniqueName: \"kubernetes.io/projected/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-kube-api-access-q2gvl\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.285787 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-config-volume\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.387443 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-config-volume\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.387580 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-secret-volume\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.387677 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2gvl\" (UniqueName: \"kubernetes.io/projected/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-kube-api-access-q2gvl\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.389550 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-config-volume\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.396735 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-secret-volume\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.410205 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2gvl\" (UniqueName: \"kubernetes.io/projected/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-kube-api-access-q2gvl\") pod \"collect-profiles-29492160-jqxz2\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.505814 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:00 crc kubenswrapper[4713]: I0127 16:00:00.961705 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2"] Jan 27 16:00:01 crc kubenswrapper[4713]: I0127 16:00:01.537930 4713 generic.go:334] "Generic (PLEG): container finished" podID="acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" containerID="be4c5cc8f6334e95acfa83cc19a0c659ba016fdbf28cfb244030888fd5a23152" exitCode=0 Jan 27 16:00:01 crc kubenswrapper[4713]: I0127 16:00:01.538089 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" event={"ID":"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a","Type":"ContainerDied","Data":"be4c5cc8f6334e95acfa83cc19a0c659ba016fdbf28cfb244030888fd5a23152"} Jan 27 16:00:01 crc kubenswrapper[4713]: I0127 16:00:01.540797 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" 
event={"ID":"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a","Type":"ContainerStarted","Data":"e178430f02f25db90b96c916fe5ae8fb93d93fc61016d9dd61fa1d8bdd34a789"} Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.786747 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.929032 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-secret-volume\") pod \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.929150 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2gvl\" (UniqueName: \"kubernetes.io/projected/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-kube-api-access-q2gvl\") pod \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.929248 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-config-volume\") pod \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\" (UID: \"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a\") " Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.930438 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-config-volume" (OuterVolumeSpecName: "config-volume") pod "acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" (UID: "acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.936197 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-kube-api-access-q2gvl" (OuterVolumeSpecName: "kube-api-access-q2gvl") pod "acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" (UID: "acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a"). InnerVolumeSpecName "kube-api-access-q2gvl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:00:02 crc kubenswrapper[4713]: I0127 16:00:02.941380 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" (UID: "acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 27 16:00:03 crc kubenswrapper[4713]: I0127 16:00:03.031523 4713 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:03 crc kubenswrapper[4713]: I0127 16:00:03.031567 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2gvl\" (UniqueName: \"kubernetes.io/projected/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-kube-api-access-q2gvl\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:03 crc kubenswrapper[4713]: I0127 16:00:03.031611 4713 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a-config-volume\") on node \"crc\" DevicePath \"\"" Jan 27 16:00:03 crc kubenswrapper[4713]: I0127 16:00:03.556760 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" 
event={"ID":"acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a","Type":"ContainerDied","Data":"e178430f02f25db90b96c916fe5ae8fb93d93fc61016d9dd61fa1d8bdd34a789"} Jan 27 16:00:03 crc kubenswrapper[4713]: I0127 16:00:03.556806 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29492160-jqxz2" Jan 27 16:00:03 crc kubenswrapper[4713]: I0127 16:00:03.556822 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e178430f02f25db90b96c916fe5ae8fb93d93fc61016d9dd61fa1d8bdd34a789" Jan 27 16:00:12 crc kubenswrapper[4713]: I0127 16:00:12.555206 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 27 16:00:12 crc kubenswrapper[4713]: I0127 16:00:12.556019 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 27 16:00:12 crc kubenswrapper[4713]: I0127 16:00:12.556199 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" Jan 27 16:00:12 crc kubenswrapper[4713]: I0127 16:00:12.557110 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce9cf5c90b2ef4b5d8bbf232cb22333413a90aeb960b0273756421fdaf75fb8b"} pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 27 16:00:12 crc 
kubenswrapper[4713]: I0127 16:00:12.557258 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" containerID="cri-o://ce9cf5c90b2ef4b5d8bbf232cb22333413a90aeb960b0273756421fdaf75fb8b" gracePeriod=600 Jan 27 16:00:13 crc kubenswrapper[4713]: I0127 16:00:13.631359 4713 generic.go:334] "Generic (PLEG): container finished" podID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerID="ce9cf5c90b2ef4b5d8bbf232cb22333413a90aeb960b0273756421fdaf75fb8b" exitCode=0 Jan 27 16:00:13 crc kubenswrapper[4713]: I0127 16:00:13.631520 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerDied","Data":"ce9cf5c90b2ef4b5d8bbf232cb22333413a90aeb960b0273756421fdaf75fb8b"} Jan 27 16:00:13 crc kubenswrapper[4713]: I0127 16:00:13.632135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"560d0585a56a5323a4b025a75a383c0d30c7c995c13b04b0d45b84883ee39f33"} Jan 27 16:00:13 crc kubenswrapper[4713]: I0127 16:00:13.632166 4713 scope.go:117] "RemoveContainer" containerID="fb203d4d7d03d6fc61ad68abfdb43cfb6ffef5fcf6fbd60deeab9e3acb4b4835" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.611363 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-d6f95/must-gather-t4stn"] Jan 27 16:00:22 crc kubenswrapper[4713]: E0127 16:00:22.612151 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" containerName="collect-profiles" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.612165 4713 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" containerName="collect-profiles" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.612282 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="acde67ab-b1c3-4db7-bbe8-b2d9c5b58c1a" containerName="collect-profiles" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.612887 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.616884 4713 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-d6f95"/"default-dockercfg-x8zwk" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.617406 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6f95"/"openshift-service-ca.crt" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.621108 4713 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-d6f95"/"kube-root-ca.crt" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.640838 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6f95/must-gather-t4stn"] Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.822594 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/414acf6a-9608-469d-a042-ac94bc172381-must-gather-output\") pod \"must-gather-t4stn\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.823027 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klvgf\" (UniqueName: \"kubernetes.io/projected/414acf6a-9608-469d-a042-ac94bc172381-kube-api-access-klvgf\") pod \"must-gather-t4stn\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " 
pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.923547 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klvgf\" (UniqueName: \"kubernetes.io/projected/414acf6a-9608-469d-a042-ac94bc172381-kube-api-access-klvgf\") pod \"must-gather-t4stn\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.923652 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/414acf6a-9608-469d-a042-ac94bc172381-must-gather-output\") pod \"must-gather-t4stn\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.924033 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/414acf6a-9608-469d-a042-ac94bc172381-must-gather-output\") pod \"must-gather-t4stn\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:22 crc kubenswrapper[4713]: I0127 16:00:22.973550 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klvgf\" (UniqueName: \"kubernetes.io/projected/414acf6a-9608-469d-a042-ac94bc172381-kube-api-access-klvgf\") pod \"must-gather-t4stn\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:23 crc kubenswrapper[4713]: I0127 16:00:23.232616 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:00:23 crc kubenswrapper[4713]: I0127 16:00:23.421381 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-d6f95/must-gather-t4stn"] Jan 27 16:00:23 crc kubenswrapper[4713]: I0127 16:00:23.701779 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6f95/must-gather-t4stn" event={"ID":"414acf6a-9608-469d-a042-ac94bc172381","Type":"ContainerStarted","Data":"2a7459543c4639d099b33089660cd22ca2decbb76144ea31ca8a58696d0739d1"} Jan 27 16:00:36 crc kubenswrapper[4713]: I0127 16:00:36.790397 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6f95/must-gather-t4stn" event={"ID":"414acf6a-9608-469d-a042-ac94bc172381","Type":"ContainerStarted","Data":"bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc"} Jan 27 16:00:36 crc kubenswrapper[4713]: I0127 16:00:36.791232 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6f95/must-gather-t4stn" event={"ID":"414acf6a-9608-469d-a042-ac94bc172381","Type":"ContainerStarted","Data":"5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393"} Jan 27 16:00:36 crc kubenswrapper[4713]: I0127 16:00:36.808297 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-d6f95/must-gather-t4stn" podStartSLOduration=2.158933609 podStartE2EDuration="14.808269494s" podCreationTimestamp="2026-01-27 16:00:22 +0000 UTC" firstStartedPulling="2026-01-27 16:00:23.432001296 +0000 UTC m=+791.210211234" lastFinishedPulling="2026-01-27 16:00:36.081337171 +0000 UTC m=+803.859547119" observedRunningTime="2026-01-27 16:00:36.805532485 +0000 UTC m=+804.583742433" watchObservedRunningTime="2026-01-27 16:00:36.808269494 +0000 UTC m=+804.586479432" Jan 27 16:01:15 crc kubenswrapper[4713]: I0127 16:01:15.029681 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-rznpq_c6936ff4-89a3-4627-aa92-34a950e2ee0f/control-plane-machine-set-operator/0.log" Jan 27 16:01:15 crc kubenswrapper[4713]: I0127 16:01:15.161500 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4sxh8_e411ddc4-0c8a-4cae-b08d-264bae41dffc/kube-rbac-proxy/0.log" Jan 27 16:01:15 crc kubenswrapper[4713]: I0127 16:01:15.213507 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-4sxh8_e411ddc4-0c8a-4cae-b08d-264bae41dffc/machine-api-operator/0.log" Jan 27 16:01:26 crc kubenswrapper[4713]: I0127 16:01:26.877703 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-86cb77c54b-jcngs_496fd2b2-f168-4003-8dff-5bec14a42142/cert-manager-controller/0.log" Jan 27 16:01:27 crc kubenswrapper[4713]: I0127 16:01:27.038825 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-855d9ccff4-nx2kd_d3b82c14-e19f-4811-ad1b-c381f5629a67/cert-manager-cainjector/0.log" Jan 27 16:01:27 crc kubenswrapper[4713]: I0127 16:01:27.064684 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-f4fb5df64-llgg9_81a892d6-5f04-4cbd-b78b-2f1ca484b3a3/cert-manager-webhook/0.log" Jan 27 16:01:42 crc kubenswrapper[4713]: I0127 16:01:42.126825 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-gjtdw_09bbc2b2-7f95-4927-978a-96873d1737ad/prometheus-operator/0.log" Jan 27 16:01:42 crc kubenswrapper[4713]: I0127 16:01:42.256806 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn_341a77a0-e923-4418-a9fd-000eb6f5853d/prometheus-operator-admission-webhook/0.log" Jan 27 16:01:42 crc kubenswrapper[4713]: I0127 
16:01:42.390931 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8_fd5e4f47-aa65-4231-989b-26060dc08c35/prometheus-operator-admission-webhook/0.log" Jan 27 16:01:42 crc kubenswrapper[4713]: I0127 16:01:42.521267 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-vfldl_a96f9484-508a-4432-9364-af9abac0a60e/operator/0.log" Jan 27 16:01:42 crc kubenswrapper[4713]: I0127 16:01:42.626319 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-l9flh_21e51ea9-6185-4ea8-bac3-032447d9da0c/perses-operator/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.327138 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/util/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.562296 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/util/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.610433 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/pull/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.629678 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/pull/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.772608 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/util/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.826167 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/pull/0.log" Jan 27 16:01:58 crc kubenswrapper[4713]: I0127 16:01:58.847870 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1f59f640c8a0eb1a7b0f26c81382bbdde784d03eb439a940bb8da3931awc8zq_97ef285c-4b33-434e-8252-fca768d794de/extract/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.027094 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/util/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.207876 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/pull/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.208191 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/util/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.212539 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/pull/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.431744 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/pull/0.log" Jan 27 
16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.454785 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/util/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.484275 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_6e3e74c24700cc2bb66271d960117ff0976dc779e6a3bc37905b952e8fs8fpq_8774e190-04c5-404d-b6e9-a86e93f0d0af/extract/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.623128 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/util/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.779527 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/util/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.814668 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/pull/0.log" Jan 27 16:01:59 crc kubenswrapper[4713]: I0127 16:01:59.853435 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/pull/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.051999 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/pull/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.061351 4713 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/extract/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.067285 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_8ed862a309935d5a1c8012df79b93f7fb46e029d4689f7f6ddcb9e7f5esd6cw_68a17f8c-4beb-4779-9917-302efd887cf8/util/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.236165 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/util/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.420091 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/util/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.426729 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/pull/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.441962 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/pull/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.604016 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/util/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.610455 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/pull/0.log" Jan 27 
16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.613380 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f087xbdc_1492d427-0396-4def-8421-920adf90b954/extract/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.782125 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/extract-utilities/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.954021 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/extract-utilities/0.log" Jan 27 16:02:00 crc kubenswrapper[4713]: I0127 16:02:00.982839 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/extract-content/0.log" Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.014700 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/extract-content/0.log" Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.154014 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/extract-utilities/0.log" Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.174601 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/extract-content/0.log" Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.400192 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-w2fhp_ebb9c0b9-bd38-4496-990a-ab1d02cc792b/registry-server/0.log" Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.406662 4713 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/extract-utilities/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.562791 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/extract-utilities/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.565103 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/extract-content/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.591137 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/extract-content/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.748357 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/extract-utilities/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.748400 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/extract-content/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.959316 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-t4f7m_4d53c0cc-4a2a-4b48-a2a8-dcf5e854f833/marketplace-operator/0.log"
Jan 27 16:02:01 crc kubenswrapper[4713]: I0127 16:02:01.986135 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jclch_9a8987a5-59c3-478b-8b6a-1f2712f6a6f8/registry-server/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.022911 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/extract-utilities/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.198790 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/extract-utilities/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.225785 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/extract-content/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.228207 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/extract-content/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.380859 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/extract-utilities/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.404163 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/extract-content/0.log"
Jan 27 16:02:02 crc kubenswrapper[4713]: I0127 16:02:02.547868 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-fkchc_dd505900-1549-404e-acfa-948789f8372d/registry-server/0.log"
Jan 27 16:02:12 crc kubenswrapper[4713]: I0127 16:02:12.554933 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:02:12 crc kubenswrapper[4713]: I0127 16:02:12.555758 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:02:14 crc kubenswrapper[4713]: I0127 16:02:14.331252 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7959c84d64-xvkz8_fd5e4f47-aa65-4231-989b-26060dc08c35/prometheus-operator-admission-webhook/0.log"
Jan 27 16:02:14 crc kubenswrapper[4713]: I0127 16:02:14.342564 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-admission-webhook-7959c84d64-7ncjn_341a77a0-e923-4418-a9fd-000eb6f5853d/prometheus-operator-admission-webhook/0.log"
Jan 27 16:02:14 crc kubenswrapper[4713]: I0127 16:02:14.346221 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_obo-prometheus-operator-68bc856cb9-gjtdw_09bbc2b2-7f95-4927-978a-96873d1737ad/prometheus-operator/0.log"
Jan 27 16:02:14 crc kubenswrapper[4713]: I0127 16:02:14.506503 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_observability-operator-59bdc8b94-vfldl_a96f9484-508a-4432-9364-af9abac0a60e/operator/0.log"
Jan 27 16:02:14 crc kubenswrapper[4713]: I0127 16:02:14.533488 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operators_perses-operator-5bf474d74f-l9flh_21e51ea9-6185-4ea8-bac3-032447d9da0c/perses-operator/0.log"
Jan 27 16:02:42 crc kubenswrapper[4713]: I0127 16:02:42.554978 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:02:42 crc kubenswrapper[4713]: I0127 16:02:42.555734 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.534681 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kz8pr"]
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.536312 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.558929 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz8pr"]
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.689239 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-catalog-content\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.689311 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c9wc\" (UniqueName: \"kubernetes.io/projected/7c1576be-7fc8-440a-8980-ef76bd09e2c1-kube-api-access-2c9wc\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.689566 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-utilities\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.790833 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-utilities\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.790910 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-catalog-content\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.790945 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c9wc\" (UniqueName: \"kubernetes.io/projected/7c1576be-7fc8-440a-8980-ef76bd09e2c1-kube-api-access-2c9wc\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.791542 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-utilities\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.791675 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-catalog-content\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.822479 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c9wc\" (UniqueName: \"kubernetes.io/projected/7c1576be-7fc8-440a-8980-ef76bd09e2c1-kube-api-access-2c9wc\") pod \"community-operators-kz8pr\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") " pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:52 crc kubenswrapper[4713]: I0127 16:02:52.858617 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:02:53 crc kubenswrapper[4713]: I0127 16:02:53.165773 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kz8pr"]
Jan 27 16:02:53 crc kubenswrapper[4713]: I0127 16:02:53.702746 4713 generic.go:334] "Generic (PLEG): container finished" podID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerID="2a74eda155e6b5b9b668e983bdbb9b02fff098a77934ca485178d6834b33b1cb" exitCode=0
Jan 27 16:02:53 crc kubenswrapper[4713]: I0127 16:02:53.702802 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8pr" event={"ID":"7c1576be-7fc8-440a-8980-ef76bd09e2c1","Type":"ContainerDied","Data":"2a74eda155e6b5b9b668e983bdbb9b02fff098a77934ca485178d6834b33b1cb"}
Jan 27 16:02:53 crc kubenswrapper[4713]: I0127 16:02:53.702837 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8pr" event={"ID":"7c1576be-7fc8-440a-8980-ef76bd09e2c1","Type":"ContainerStarted","Data":"bc6f46e1119b126b2fe081073ce868ba5fede8f5a242b88eb6c256783cf771e6"}
Jan 27 16:02:53 crc kubenswrapper[4713]: I0127 16:02:53.704846 4713 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.109021 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-96bf7"]
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.110617 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.133024 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96bf7"]
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.230602 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-catalog-content\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.231110 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-utilities\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.231136 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lxh9\" (UniqueName: \"kubernetes.io/projected/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-kube-api-access-2lxh9\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.332685 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-utilities\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.332733 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lxh9\" (UniqueName: \"kubernetes.io/projected/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-kube-api-access-2lxh9\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.332773 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-catalog-content\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.333390 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-catalog-content\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.333414 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-utilities\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.356746 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lxh9\" (UniqueName: \"kubernetes.io/projected/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-kube-api-access-2lxh9\") pod \"certified-operators-96bf7\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") " pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.434110 4713 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.717065 4713 generic.go:334] "Generic (PLEG): container finished" podID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerID="9b36e07bc20a21feddc53403f4e3521624732906b1386b977a2731c4d82f60d9" exitCode=0
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.717135 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8pr" event={"ID":"7c1576be-7fc8-440a-8980-ef76bd09e2c1","Type":"ContainerDied","Data":"9b36e07bc20a21feddc53403f4e3521624732906b1386b977a2731c4d82f60d9"}
Jan 27 16:02:55 crc kubenswrapper[4713]: I0127 16:02:55.900690 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-96bf7"]
Jan 27 16:02:55 crc kubenswrapper[4713]: W0127 16:02:55.906961 4713 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55ecc2ef_afbc_40f2_9205_d0dcd7c435af.slice/crio-57ce9e746b2fa1e5f788d85c352ee076df4c497c64f39ad51dde74183e38b432 WatchSource:0}: Error finding container 57ce9e746b2fa1e5f788d85c352ee076df4c497c64f39ad51dde74183e38b432: Status 404 returned error can't find the container with id 57ce9e746b2fa1e5f788d85c352ee076df4c497c64f39ad51dde74183e38b432
Jan 27 16:02:56 crc kubenswrapper[4713]: I0127 16:02:56.729923 4713 generic.go:334] "Generic (PLEG): container finished" podID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerID="17c9f86588c4c79179bc5807bbc74f74ad50f8d79b1b10f60f884e6da468f384" exitCode=0
Jan 27 16:02:56 crc kubenswrapper[4713]: I0127 16:02:56.729987 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerDied","Data":"17c9f86588c4c79179bc5807bbc74f74ad50f8d79b1b10f60f884e6da468f384"}
Jan 27 16:02:56 crc kubenswrapper[4713]: I0127 16:02:56.730024 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerStarted","Data":"57ce9e746b2fa1e5f788d85c352ee076df4c497c64f39ad51dde74183e38b432"}
Jan 27 16:02:57 crc kubenswrapper[4713]: I0127 16:02:57.738785 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8pr" event={"ID":"7c1576be-7fc8-440a-8980-ef76bd09e2c1","Type":"ContainerStarted","Data":"2b03bfee859459b6eb3528a189da174a9cc793ccabf2e2424cfea3841ee391ee"}
Jan 27 16:02:59 crc kubenswrapper[4713]: I0127 16:02:59.754634 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerStarted","Data":"a991131b4029b0f475833a3a70b516d5cddd275deb853b10e92c358674cd012a"}
Jan 27 16:02:59 crc kubenswrapper[4713]: I0127 16:02:59.775153 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kz8pr" podStartSLOduration=4.340702054 podStartE2EDuration="7.775127728s" podCreationTimestamp="2026-01-27 16:02:52 +0000 UTC" firstStartedPulling="2026-01-27 16:02:53.704547226 +0000 UTC m=+941.482757164" lastFinishedPulling="2026-01-27 16:02:57.13897286 +0000 UTC m=+944.917182838" observedRunningTime="2026-01-27 16:02:57.760107883 +0000 UTC m=+945.538317821" watchObservedRunningTime="2026-01-27 16:02:59.775127728 +0000 UTC m=+947.553337666"
Jan 27 16:03:00 crc kubenswrapper[4713]: I0127 16:03:00.761761 4713 generic.go:334] "Generic (PLEG): container finished" podID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerID="a991131b4029b0f475833a3a70b516d5cddd275deb853b10e92c358674cd012a" exitCode=0
Jan 27 16:03:00 crc kubenswrapper[4713]: I0127 16:03:00.761830 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerDied","Data":"a991131b4029b0f475833a3a70b516d5cddd275deb853b10e92c358674cd012a"}
Jan 27 16:03:01 crc kubenswrapper[4713]: I0127 16:03:01.769746 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerStarted","Data":"35d990e6316d3664672b67bfa837f07f04345f9b51c2e6de8b7fa45771737b69"}
Jan 27 16:03:01 crc kubenswrapper[4713]: I0127 16:03:01.795110 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-96bf7" podStartSLOduration=2.411910652 podStartE2EDuration="6.795086696s" podCreationTimestamp="2026-01-27 16:02:55 +0000 UTC" firstStartedPulling="2026-01-27 16:02:56.814823252 +0000 UTC m=+944.593033190" lastFinishedPulling="2026-01-27 16:03:01.197999296 +0000 UTC m=+948.976209234" observedRunningTime="2026-01-27 16:03:01.79173534 +0000 UTC m=+949.569945288" watchObservedRunningTime="2026-01-27 16:03:01.795086696 +0000 UTC m=+949.573296634"
Jan 27 16:03:02 crc kubenswrapper[4713]: I0127 16:03:02.859485 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:03:02 crc kubenswrapper[4713]: I0127 16:03:02.859583 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:03:02 crc kubenswrapper[4713]: I0127 16:03:02.908477 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:03:03 crc kubenswrapper[4713]: I0127 16:03:03.829015 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:03:04 crc kubenswrapper[4713]: I0127 16:03:04.103926 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kz8pr"]
Jan 27 16:03:05 crc kubenswrapper[4713]: I0127 16:03:05.434415 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:03:05 crc kubenswrapper[4713]: I0127 16:03:05.435903 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:03:05 crc kubenswrapper[4713]: I0127 16:03:05.482673 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:03:05 crc kubenswrapper[4713]: I0127 16:03:05.797949 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kz8pr" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="registry-server" containerID="cri-o://2b03bfee859459b6eb3528a189da174a9cc793ccabf2e2424cfea3841ee391ee" gracePeriod=2
Jan 27 16:03:06 crc kubenswrapper[4713]: I0127 16:03:06.819391 4713 generic.go:334] "Generic (PLEG): container finished" podID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerID="2b03bfee859459b6eb3528a189da174a9cc793ccabf2e2424cfea3841ee391ee" exitCode=0
Jan 27 16:03:06 crc kubenswrapper[4713]: I0127 16:03:06.820459 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8pr" event={"ID":"7c1576be-7fc8-440a-8980-ef76bd09e2c1","Type":"ContainerDied","Data":"2b03bfee859459b6eb3528a189da174a9cc793ccabf2e2424cfea3841ee391ee"}
Jan 27 16:03:06 crc kubenswrapper[4713]: I0127 16:03:06.870116 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.361768 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.415683 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-catalog-content\") pod \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") "
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.416226 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-utilities\") pod \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") "
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.416298 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2c9wc\" (UniqueName: \"kubernetes.io/projected/7c1576be-7fc8-440a-8980-ef76bd09e2c1-kube-api-access-2c9wc\") pod \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\" (UID: \"7c1576be-7fc8-440a-8980-ef76bd09e2c1\") "
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.417168 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-utilities" (OuterVolumeSpecName: "utilities") pod "7c1576be-7fc8-440a-8980-ef76bd09e2c1" (UID: "7c1576be-7fc8-440a-8980-ef76bd09e2c1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.432469 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c1576be-7fc8-440a-8980-ef76bd09e2c1-kube-api-access-2c9wc" (OuterVolumeSpecName: "kube-api-access-2c9wc") pod "7c1576be-7fc8-440a-8980-ef76bd09e2c1" (UID: "7c1576be-7fc8-440a-8980-ef76bd09e2c1"). InnerVolumeSpecName "kube-api-access-2c9wc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.470726 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7c1576be-7fc8-440a-8980-ef76bd09e2c1" (UID: "7c1576be-7fc8-440a-8980-ef76bd09e2c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.501696 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96bf7"]
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.517713 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.517762 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7c1576be-7fc8-440a-8980-ef76bd09e2c1-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.517772 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2c9wc\" (UniqueName: \"kubernetes.io/projected/7c1576be-7fc8-440a-8980-ef76bd09e2c1-kube-api-access-2c9wc\") on node \"crc\" DevicePath \"\""
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.844106 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kz8pr"
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.846027 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kz8pr" event={"ID":"7c1576be-7fc8-440a-8980-ef76bd09e2c1","Type":"ContainerDied","Data":"bc6f46e1119b126b2fe081073ce868ba5fede8f5a242b88eb6c256783cf771e6"}
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.846142 4713 scope.go:117] "RemoveContainer" containerID="2b03bfee859459b6eb3528a189da174a9cc793ccabf2e2424cfea3841ee391ee"
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.870900 4713 scope.go:117] "RemoveContainer" containerID="9b36e07bc20a21feddc53403f4e3521624732906b1386b977a2731c4d82f60d9"
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.877609 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kz8pr"]
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.882827 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kz8pr"]
Jan 27 16:03:07 crc kubenswrapper[4713]: I0127 16:03:07.893202 4713 scope.go:117] "RemoveContainer" containerID="2a74eda155e6b5b9b668e983bdbb9b02fff098a77934ca485178d6834b33b1cb"
Jan 27 16:03:08 crc kubenswrapper[4713]: I0127 16:03:08.851499 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-96bf7" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="registry-server" containerID="cri-o://35d990e6316d3664672b67bfa837f07f04345f9b51c2e6de8b7fa45771737b69" gracePeriod=2
Jan 27 16:03:08 crc kubenswrapper[4713]: I0127 16:03:08.925649 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" path="/var/lib/kubelet/pods/7c1576be-7fc8-440a-8980-ef76bd09e2c1/volumes"
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.867778 4713 generic.go:334] "Generic (PLEG): container finished" podID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerID="35d990e6316d3664672b67bfa837f07f04345f9b51c2e6de8b7fa45771737b69" exitCode=0
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.867868 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerDied","Data":"35d990e6316d3664672b67bfa837f07f04345f9b51c2e6de8b7fa45771737b69"}
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.867915 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-96bf7" event={"ID":"55ecc2ef-afbc-40f2-9205-d0dcd7c435af","Type":"ContainerDied","Data":"57ce9e746b2fa1e5f788d85c352ee076df4c497c64f39ad51dde74183e38b432"}
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.867938 4713 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="57ce9e746b2fa1e5f788d85c352ee076df4c497c64f39ad51dde74183e38b432"
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.872973 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.952972 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lxh9\" (UniqueName: \"kubernetes.io/projected/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-kube-api-access-2lxh9\") pod \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") "
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.953174 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-catalog-content\") pod \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") "
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.953336 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-utilities\") pod \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\" (UID: \"55ecc2ef-afbc-40f2-9205-d0dcd7c435af\") "
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.954891 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-utilities" (OuterVolumeSpecName: "utilities") pod "55ecc2ef-afbc-40f2-9205-d0dcd7c435af" (UID: "55ecc2ef-afbc-40f2-9205-d0dcd7c435af"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 16:03:09 crc kubenswrapper[4713]: I0127 16:03:09.961674 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-kube-api-access-2lxh9" (OuterVolumeSpecName: "kube-api-access-2lxh9") pod "55ecc2ef-afbc-40f2-9205-d0dcd7c435af" (UID: "55ecc2ef-afbc-40f2-9205-d0dcd7c435af"). InnerVolumeSpecName "kube-api-access-2lxh9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.005677 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "55ecc2ef-afbc-40f2-9205-d0dcd7c435af" (UID: "55ecc2ef-afbc-40f2-9205-d0dcd7c435af"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.055098 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lxh9\" (UniqueName: \"kubernetes.io/projected/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-kube-api-access-2lxh9\") on node \"crc\" DevicePath \"\""
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.055155 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.055174 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/55ecc2ef-afbc-40f2-9205-d0dcd7c435af-utilities\") on node \"crc\" DevicePath \"\""
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.873063 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-96bf7"
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.911872 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-96bf7"]
Jan 27 16:03:10 crc kubenswrapper[4713]: I0127 16:03:10.926516 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-96bf7"]
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.554627 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.554697 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.554749 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz"
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.555264 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"560d0585a56a5323a4b025a75a383c0d30c7c995c13b04b0d45b84883ee39f33"} pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.555312 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" containerID="cri-o://560d0585a56a5323a4b025a75a383c0d30c7c995c13b04b0d45b84883ee39f33" gracePeriod=600
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.886302 4713 generic.go:334] "Generic (PLEG): container finished" podID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerID="560d0585a56a5323a4b025a75a383c0d30c7c995c13b04b0d45b84883ee39f33" exitCode=0
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.886363 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerDied","Data":"560d0585a56a5323a4b025a75a383c0d30c7c995c13b04b0d45b84883ee39f33"}
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.886707 4713 scope.go:117] "RemoveContainer" containerID="ce9cf5c90b2ef4b5d8bbf232cb22333413a90aeb960b0273756421fdaf75fb8b"
Jan 27 16:03:12 crc kubenswrapper[4713]: I0127 16:03:12.913968 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" path="/var/lib/kubelet/pods/55ecc2ef-afbc-40f2-9205-d0dcd7c435af/volumes"
Jan 27 16:03:13 crc kubenswrapper[4713]: I0127 16:03:13.895686 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"5a4605fe3c7a7fcac1b377a32c190a69b3f7ac27a6fa7dfc407ba60f17ec75a8"}
Jan 27 16:03:14 crc kubenswrapper[4713]: I0127 16:03:14.924481 4713 generic.go:334] "Generic (PLEG): container finished" podID="414acf6a-9608-469d-a042-ac94bc172381" containerID="5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393" exitCode=0
Jan 27 16:03:14 crc kubenswrapper[4713]: I0127 16:03:14.924666 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-d6f95/must-gather-t4stn" event={"ID":"414acf6a-9608-469d-a042-ac94bc172381","Type":"ContainerDied","Data":"5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393"}
Jan 27 16:03:14 crc kubenswrapper[4713]: I0127 16:03:14.925541 4713 scope.go:117] "RemoveContainer" containerID="5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393"
Jan 27 16:03:15 crc kubenswrapper[4713]: I0127 16:03:15.308378 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6f95_must-gather-t4stn_414acf6a-9608-469d-a042-ac94bc172381/gather/0.log"
Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.346693 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-d6f95/must-gather-t4stn"]
Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.347835 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-d6f95/must-gather-t4stn" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="copy" containerID="cri-o://bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc" gracePeriod=2
Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.350988 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-d6f95/must-gather-t4stn"]
Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.684701 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6f95_must-gather-t4stn_414acf6a-9608-469d-a042-ac94bc172381/copy/0.log"
Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.685539 4713 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.760617 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/414acf6a-9608-469d-a042-ac94bc172381-must-gather-output\") pod \"414acf6a-9608-469d-a042-ac94bc172381\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.760749 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klvgf\" (UniqueName: \"kubernetes.io/projected/414acf6a-9608-469d-a042-ac94bc172381-kube-api-access-klvgf\") pod \"414acf6a-9608-469d-a042-ac94bc172381\" (UID: \"414acf6a-9608-469d-a042-ac94bc172381\") " Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.769733 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414acf6a-9608-469d-a042-ac94bc172381-kube-api-access-klvgf" (OuterVolumeSpecName: "kube-api-access-klvgf") pod "414acf6a-9608-469d-a042-ac94bc172381" (UID: "414acf6a-9608-469d-a042-ac94bc172381"). InnerVolumeSpecName "kube-api-access-klvgf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.814783 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/414acf6a-9608-469d-a042-ac94bc172381-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "414acf6a-9608-469d-a042-ac94bc172381" (UID: "414acf6a-9608-469d-a042-ac94bc172381"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.862794 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klvgf\" (UniqueName: \"kubernetes.io/projected/414acf6a-9608-469d-a042-ac94bc172381-kube-api-access-klvgf\") on node \"crc\" DevicePath \"\"" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.862842 4713 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/414acf6a-9608-469d-a042-ac94bc172381-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.906395 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414acf6a-9608-469d-a042-ac94bc172381" path="/var/lib/kubelet/pods/414acf6a-9608-469d-a042-ac94bc172381/volumes" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.977412 4713 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-d6f95_must-gather-t4stn_414acf6a-9608-469d-a042-ac94bc172381/copy/0.log" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.977839 4713 generic.go:334] "Generic (PLEG): container finished" podID="414acf6a-9608-469d-a042-ac94bc172381" containerID="bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc" exitCode=143 Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.977890 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-d6f95/must-gather-t4stn" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.977987 4713 scope.go:117] "RemoveContainer" containerID="bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc" Jan 27 16:03:22 crc kubenswrapper[4713]: I0127 16:03:22.997741 4713 scope.go:117] "RemoveContainer" containerID="5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393" Jan 27 16:03:23 crc kubenswrapper[4713]: I0127 16:03:23.037771 4713 scope.go:117] "RemoveContainer" containerID="bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc" Jan 27 16:03:23 crc kubenswrapper[4713]: E0127 16:03:23.038371 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc\": container with ID starting with bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc not found: ID does not exist" containerID="bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc" Jan 27 16:03:23 crc kubenswrapper[4713]: I0127 16:03:23.038494 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc"} err="failed to get container status \"bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc\": rpc error: code = NotFound desc = could not find container \"bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc\": container with ID starting with bde0fbdfa373f3d56d4442cedcd0e711596e3719d5e4e282e403c7000a16d9dc not found: ID does not exist" Jan 27 16:03:23 crc kubenswrapper[4713]: I0127 16:03:23.038596 4713 scope.go:117] "RemoveContainer" containerID="5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393" Jan 27 16:03:23 crc kubenswrapper[4713]: E0127 16:03:23.038992 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393\": container with ID starting with 5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393 not found: ID does not exist" containerID="5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393" Jan 27 16:03:23 crc kubenswrapper[4713]: I0127 16:03:23.039098 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393"} err="failed to get container status \"5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393\": rpc error: code = NotFound desc = could not find container \"5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393\": container with ID starting with 5d80504e0a0826e85ad9319fe515d07042ae20fc6a4d468e0a76f97bb4bca393 not found: ID does not exist" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.340182 4713 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9774r"] Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341196 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="extract-utilities" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341213 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="extract-utilities" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341231 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="copy" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341238 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="copy" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341249 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="extract-content" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341257 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="extract-content" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341271 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="gather" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341278 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="gather" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341291 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="registry-server" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341299 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="registry-server" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341314 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="extract-utilities" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341327 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="extract-utilities" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341338 4713 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="extract-content" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341347 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="extract-content" Jan 27 16:04:04 crc kubenswrapper[4713]: E0127 16:04:04.341360 4713 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="registry-server" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341370 4713 state_mem.go:107] "Deleted CPUSet assignment" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="registry-server" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341540 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c1576be-7fc8-440a-8980-ef76bd09e2c1" containerName="registry-server" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341557 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="55ecc2ef-afbc-40f2-9205-d0dcd7c435af" containerName="registry-server" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341569 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="copy" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.341589 4713 memory_manager.go:354] "RemoveStaleState removing state" podUID="414acf6a-9608-469d-a042-ac94bc172381" containerName="gather" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.342870 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.353875 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9774r"] Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.437819 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-utilities\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.437950 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-catalog-content\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.437987 4713 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv2fk\" (UniqueName: \"kubernetes.io/projected/d01755f6-986e-43e6-9adc-8ac84624e362-kube-api-access-sv2fk\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.539863 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-utilities\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.540185 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-catalog-content\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.540219 4713 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv2fk\" (UniqueName: \"kubernetes.io/projected/d01755f6-986e-43e6-9adc-8ac84624e362-kube-api-access-sv2fk\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.540823 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-utilities\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.540833 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-catalog-content\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.564135 4713 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv2fk\" (UniqueName: \"kubernetes.io/projected/d01755f6-986e-43e6-9adc-8ac84624e362-kube-api-access-sv2fk\") pod \"redhat-operators-9774r\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.671398 4713 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:04 crc kubenswrapper[4713]: I0127 16:04:04.922057 4713 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9774r"] Jan 27 16:04:05 crc kubenswrapper[4713]: I0127 16:04:05.309156 4713 generic.go:334] "Generic (PLEG): container finished" podID="d01755f6-986e-43e6-9adc-8ac84624e362" containerID="e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69" exitCode=0 Jan 27 16:04:05 crc kubenswrapper[4713]: I0127 16:04:05.309209 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerDied","Data":"e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69"} Jan 27 16:04:05 crc kubenswrapper[4713]: I0127 16:04:05.309616 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerStarted","Data":"a4926007b33c21d16ee4ac5aa68ab8b200ce60f8e56a63dfd70494db28e95101"} Jan 27 16:04:06 crc kubenswrapper[4713]: I0127 16:04:06.326686 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerStarted","Data":"8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0"} Jan 27 16:04:07 crc kubenswrapper[4713]: I0127 16:04:07.335495 4713 generic.go:334] "Generic (PLEG): container finished" podID="d01755f6-986e-43e6-9adc-8ac84624e362" containerID="8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0" exitCode=0 Jan 27 16:04:07 crc kubenswrapper[4713]: I0127 16:04:07.335559 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" 
event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerDied","Data":"8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0"} Jan 27 16:04:08 crc kubenswrapper[4713]: I0127 16:04:08.346573 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerStarted","Data":"a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0"} Jan 27 16:04:08 crc kubenswrapper[4713]: I0127 16:04:08.371947 4713 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9774r" podStartSLOduration=1.655878081 podStartE2EDuration="4.371913786s" podCreationTimestamp="2026-01-27 16:04:04 +0000 UTC" firstStartedPulling="2026-01-27 16:04:05.310652835 +0000 UTC m=+1013.088862773" lastFinishedPulling="2026-01-27 16:04:08.02668854 +0000 UTC m=+1015.804898478" observedRunningTime="2026-01-27 16:04:08.366516421 +0000 UTC m=+1016.144726359" watchObservedRunningTime="2026-01-27 16:04:08.371913786 +0000 UTC m=+1016.150123744" Jan 27 16:04:14 crc kubenswrapper[4713]: I0127 16:04:14.672208 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:14 crc kubenswrapper[4713]: I0127 16:04:14.672690 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:14 crc kubenswrapper[4713]: I0127 16:04:14.728069 4713 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:15 crc kubenswrapper[4713]: I0127 16:04:15.449548 4713 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:15 crc kubenswrapper[4713]: I0127 16:04:15.514564 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-9774r"] Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.417199 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9774r" podUID="d01755f6-986e-43e6-9adc-8ac84624e362" containerName="registry-server" containerID="cri-o://a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0" gracePeriod=2 Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.802307 4713 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.935117 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-utilities\") pod \"d01755f6-986e-43e6-9adc-8ac84624e362\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.935190 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv2fk\" (UniqueName: \"kubernetes.io/projected/d01755f6-986e-43e6-9adc-8ac84624e362-kube-api-access-sv2fk\") pod \"d01755f6-986e-43e6-9adc-8ac84624e362\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.935277 4713 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-catalog-content\") pod \"d01755f6-986e-43e6-9adc-8ac84624e362\" (UID: \"d01755f6-986e-43e6-9adc-8ac84624e362\") " Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.937862 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-utilities" (OuterVolumeSpecName: "utilities") pod "d01755f6-986e-43e6-9adc-8ac84624e362" (UID: 
"d01755f6-986e-43e6-9adc-8ac84624e362"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:04:17 crc kubenswrapper[4713]: I0127 16:04:17.948342 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d01755f6-986e-43e6-9adc-8ac84624e362-kube-api-access-sv2fk" (OuterVolumeSpecName: "kube-api-access-sv2fk") pod "d01755f6-986e-43e6-9adc-8ac84624e362" (UID: "d01755f6-986e-43e6-9adc-8ac84624e362"). InnerVolumeSpecName "kube-api-access-sv2fk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.036849 4713 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-utilities\") on node \"crc\" DevicePath \"\"" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.036895 4713 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv2fk\" (UniqueName: \"kubernetes.io/projected/d01755f6-986e-43e6-9adc-8ac84624e362-kube-api-access-sv2fk\") on node \"crc\" DevicePath \"\"" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.048069 4713 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d01755f6-986e-43e6-9adc-8ac84624e362" (UID: "d01755f6-986e-43e6-9adc-8ac84624e362"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.138357 4713 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d01755f6-986e-43e6-9adc-8ac84624e362-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.432698 4713 generic.go:334] "Generic (PLEG): container finished" podID="d01755f6-986e-43e6-9adc-8ac84624e362" containerID="a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0" exitCode=0 Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.432763 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerDied","Data":"a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0"} Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.432809 4713 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9774r" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.432831 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9774r" event={"ID":"d01755f6-986e-43e6-9adc-8ac84624e362","Type":"ContainerDied","Data":"a4926007b33c21d16ee4ac5aa68ab8b200ce60f8e56a63dfd70494db28e95101"} Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.432866 4713 scope.go:117] "RemoveContainer" containerID="a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.466299 4713 scope.go:117] "RemoveContainer" containerID="8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.490329 4713 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9774r"] Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.497971 4713 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9774r"] Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.506292 4713 scope.go:117] "RemoveContainer" containerID="e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.530417 4713 scope.go:117] "RemoveContainer" containerID="a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0" Jan 27 16:04:18 crc kubenswrapper[4713]: E0127 16:04:18.531110 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0\": container with ID starting with a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0 not found: ID does not exist" containerID="a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.531157 4713 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0"} err="failed to get container status \"a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0\": rpc error: code = NotFound desc = could not find container \"a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0\": container with ID starting with a54026ca0bf023f86ff8df925d2e9cde24da359ea6caabc9b5812480160250f0 not found: ID does not exist" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.531186 4713 scope.go:117] "RemoveContainer" containerID="8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0" Jan 27 16:04:18 crc kubenswrapper[4713]: E0127 16:04:18.532169 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0\": container with ID starting with 8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0 not found: ID does not exist" containerID="8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.532303 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0"} err="failed to get container status \"8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0\": rpc error: code = NotFound desc = could not find container \"8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0\": container with ID starting with 8206d55f9730ec75179743300ac1460cf502c797417974103759afd1b32cebc0 not found: ID does not exist" Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.532354 4713 scope.go:117] "RemoveContainer" containerID="e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69" Jan 27 16:04:18 crc kubenswrapper[4713]: E0127 
16:04:18.532921 4713 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69\": container with ID starting with e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69 not found: ID does not exist" containerID="e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69"
Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.533116 4713 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69"} err="failed to get container status \"e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69\": rpc error: code = NotFound desc = could not find container \"e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69\": container with ID starting with e48929fb08c08c8f0f681a666b81f0b21d25b1533bcce15b49e34bee0d751a69 not found: ID does not exist"
Jan 27 16:04:18 crc kubenswrapper[4713]: I0127 16:04:18.907743 4713 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d01755f6-986e-43e6-9adc-8ac84624e362" path="/var/lib/kubelet/pods/d01755f6-986e-43e6-9adc-8ac84624e362/volumes"
Jan 27 16:05:12 crc kubenswrapper[4713]: I0127 16:05:12.556500 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:05:12 crc kubenswrapper[4713]: I0127 16:05:12.557563 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:05:42 crc kubenswrapper[4713]: I0127 16:05:42.555220 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:05:42 crc kubenswrapper[4713]: I0127 16:05:42.556178 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:06:12 crc kubenswrapper[4713]: I0127 16:06:12.555587 4713 patch_prober.go:28] interesting pod/machine-config-daemon-6h5wz container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 27 16:06:12 crc kubenswrapper[4713]: I0127 16:06:12.556486 4713 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 27 16:06:12 crc kubenswrapper[4713]: I0127 16:06:12.556557 4713 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz"
Jan 27 16:06:12 crc kubenswrapper[4713]: I0127 16:06:12.557439 4713 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"5a4605fe3c7a7fcac1b377a32c190a69b3f7ac27a6fa7dfc407ba60f17ec75a8"} pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 27 16:06:12 crc kubenswrapper[4713]: I0127 16:06:12.557521 4713 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" podUID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerName="machine-config-daemon" containerID="cri-o://5a4605fe3c7a7fcac1b377a32c190a69b3f7ac27a6fa7dfc407ba60f17ec75a8" gracePeriod=600
Jan 27 16:06:13 crc kubenswrapper[4713]: I0127 16:06:13.295364 4713 generic.go:334] "Generic (PLEG): container finished" podID="9bd62e63-357b-4f16-a2a1-e6a1d2375808" containerID="5a4605fe3c7a7fcac1b377a32c190a69b3f7ac27a6fa7dfc407ba60f17ec75a8" exitCode=0
Jan 27 16:06:13 crc kubenswrapper[4713]: I0127 16:06:13.295445 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerDied","Data":"5a4605fe3c7a7fcac1b377a32c190a69b3f7ac27a6fa7dfc407ba60f17ec75a8"}
Jan 27 16:06:13 crc kubenswrapper[4713]: I0127 16:06:13.295856 4713 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-6h5wz" event={"ID":"9bd62e63-357b-4f16-a2a1-e6a1d2375808","Type":"ContainerStarted","Data":"6a42bc5af1b44b085c52dbe93c249823b912e1c28f74673633ee157874a948d4"}
Jan 27 16:06:13 crc kubenswrapper[4713]: I0127 16:06:13.295886 4713 scope.go:117] "RemoveContainer" containerID="560d0585a56a5323a4b025a75a383c0d30c7c995c13b04b0d45b84883ee39f33"